2025-04-01 22:31:19,081 [ 72856 ] INFO : ClickHouse root is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse (runner:53, check_args_and_update_paths) 2025-04-01 22:31:19,081 [ 72856 ] INFO : Cases dir is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration (runner:97, check_args_and_update_paths) 2025-04-01 22:31:19,081 [ 72856 ] INFO : utils dir is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse/utils (runner:108, check_args_and_update_paths) 2025-04-01 22:31:19,081 [ 72856 ] INFO : base_configs_dir: /home/ubuntu/_work/ClickHouse/ClickHouse/programs/server, binary: /home/ubuntu/_work/_temp/test/build/clickhouse, cases_dir: /home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration (runner:110, check_args_and_update_paths) clickhouse_integration_tests_volume Running pytest container as: 'docker run --rm --name clickhouse_integration_tests_pcfdyn --privileged --dns-search='.' --memory=30709030912 --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-odbc-bridge:/clickhouse-odbc-bridge --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-library-bridge:/clickhouse-library-bridge --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/backupview:/ClickHouse/utils/backupview --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/grpc-client/pb2:/ClickHouse/utils/grpc-client/pb2 --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_DOTNET_CLIENT_TAG=11de0b29a15d -e DOCKER_HELPER_TAG=5dc43a6382f0 -e DOCKER_BASE_TAG=6712d5cc610d -e DOCKER_KERBEROS_KDC_TAG=9391ecdee8d7 -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=9bec2a638e6e -e DOCKER_MYSQL_JAVA_CLIENT_TAG=766bff31cfe4 -e DOCKER_MYSQL_JS_CLIENT_TAG=41ba7c2ec2a1 -e DOCKER_MYSQL_PHP_CLIENT_TAG=88be89c1e3b6 -e DOCKER_NGINX_DAV_TAG=b55ac9cd7519 -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=a4eff5c7f4d6 -e DOCKER_PYTHON_BOTTLE_TAG=caad4729259e -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e PYTHONUNBUFFERED=1 -e PYTEST_ADDOPTS="--dist=loadfile -n 10 -rfEps --run-id=0 --color=no --durations=0 test_MemoryTracking/test.py::test_http test_MemoryTracking/test.py::test_tcp_multiple_sessions test_MemoryTracking/test.py::test_tcp_single_session 'test_aliases_in_default_expr_not_break_table_structure/test.py::test_aliases_in_default_expr_not_break_table_structure[ReplicatedMergeTree]' test_allowed_url_from_config/test.py::test_HDFS test_allowed_url_from_config/test.py::test_config_with_hosts test_allowed_url_from_config/test.py::test_config_with_only_primary_hosts test_allowed_url_from_config/test.py::test_config_with_only_regexp_hosts test_allowed_url_from_config/test.py::test_config_without_allowed_hosts test_allowed_url_from_config/test.py::test_config_without_allowed_hosts_section test_allowed_url_from_config/test.py::test_redirect test_allowed_url_from_config/test.py::test_schema_inference test_allowed_url_from_config/test.py::test_table_function_remote test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active test_asynchronous_metric_log_table/test.py::test_event_time_microseconds_field test_asynchronous_metrics_pk_bytes_fields/test.py::test_total_pk_bytes_in_memory_fields test_attach_partition_using_copy/test.py::test_all_replicated 
test_attach_partition_using_copy/test.py::test_both_mergetree test_attach_partition_using_copy/test.py::test_not_work_on_different_disk test_attach_partition_using_copy/test.py::test_only_destination_replicated test_azure_blob_storage_plain_rewritable/test.py::test_drop_table test_azure_blob_storage_plain_rewritable/test.py::test_insert_select test_azure_blob_storage_plain_rewritable/test.py::test_restart_server test_backup_restore_azure_blob_storage/test.py::test_backup_restore test_backup_restore_azure_blob_storage/test.py::test_backup_restore_correct_block_ids test_backup_restore_azure_blob_storage/test.py::test_backup_restore_diff_container test_backup_restore_azure_blob_storage/test.py::test_backup_restore_on_merge_tree test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf1 test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf2 test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_different_nodes test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_same_node 'test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Atomic-MergeTree]' 'test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Lazy-Log]' 'test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Memory-MergeTree]' 'test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Ordinary-MergeTree]' 'test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Replicated-ReplicatedMergeTree]' test_backup_restore_on_cluster/test_concurrency.py::test_kill_mutation_during_backup test_backup_restore_on_cluster/test_concurrency.py::test_replicated_table test_backup_restore_on_cluster/test_slow_rmt.py::test_replicated_database_async test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_replicated_table test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_two_tables_with_uuid_in_zk_path 'test_backup_restore_storage_policy/test.py::test_storage_policies[None--default]' 'test_backup_restore_storage_policy/test.py::test_storage_policies[None-None-default]' 'test_backup_restore_storage_policy/test.py::test_storage_policies[None-policy1-policy1]' 'test_backup_restore_storage_policy/test.py::test_storage_policies[policy1--default]' 'test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-None-policy1]' 'test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy1-policy1]' 'test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy2-policy2]' test_backup_s3_storage_class/test.py::test_backup_s3_storage_class test_backward_compatibility/test.py::test_backward_compatability1 test_backward_compatibility/test_aggregate_fixed_key.py::test_two_level_merge test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_avg 'test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[1000]' 'test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[500000]' 'test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[1000]' 'test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[500000]' 
test_backward_compatibility/test_convert_ordinary.py::test_convert_ordinary_to_atomic test_backward_compatibility/test_functions.py::test_aggregate_states test_backward_compatibility/test_functions.py::test_string_functions test_backward_compatibility/test_ip_types_binary_compatibility.py::test_ip_types_binary_compatibility test_backward_compatibility/test_normalized_count_comparison.py::test_select_aggregate_alias_column test_backward_compatibility/test_select_aggregate_alias_column.py::test_select_aggregate_alias_column test_backward_compatibility/test_vertical_merges_from_compact_parts.py::test_vertical_merges_from_compact_parts test_broken_part_during_merge/test.py::test_merge_and_part_corruption test_buffer_profile/test.py::test_buffer_profile test_buffer_profile/test.py::test_default_profile test_build_sets_from_multiple_threads/test.py::test_set test_catboost_evaluate/test.py::testAmazonModelManyRows test_catboost_evaluate/test.py::testAmazonModelSingleRow test_catboost_evaluate/test.py::testCategoricalFeatureMustBeNumericOrString test_catboost_evaluate/test.py::testConstantFeatures test_catboost_evaluate/test.py::testFloatFeatureMustBeNumeric test_catboost_evaluate/test.py::testInvalidLibraryPath test_catboost_evaluate/test.py::testInvalidModelPath test_catboost_evaluate/test.py::testModelPathIsNotAConstString test_catboost_evaluate/test.py::testModelUpdate test_catboost_evaluate/test.py::testNonConstantFeatures test_catboost_evaluate/test.py::testOnLowCardinalityFeatures test_catboost_evaluate/test.py::testOnNullableFeatures test_catboost_evaluate/test.py::testRecoveryAfterCrash test_catboost_evaluate/test.py::testSystemModelsAndModelRefresh test_catboost_evaluate/test.py::testWrongNumberOfFeatureArguments test_cluster_discovery/test.py::test_cluster_discovery_startup_and_stop test_cluster_discovery/test_auxiliary_keeper.py::test_cluster_discovery_with_auxiliary_keeper_startup_and_stop test_cluster_discovery/test_dynamic_clusters.py::test_cluster_discovery_startup_and_stop test_codec_encrypted/test.py::test_different_keys test_composable_protocols/test.py::test_connections test_compressed_marks_restart/test.py::test_compressed_marks_restart_compact test_compressed_marks_restart/test.py::test_compressed_marks_restart_wide test_concurrent_queries_for_user_restriction/test.py::test_exception_message test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool_replicated test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_two_replicas test_concurrent_ttl_merges/test.py::test_no_ttl_merges_in_busy_pool test_config_decryption/test_wrong_settings.py::test_invalid_chars test_config_decryption/test_wrong_settings.py::test_no_encryption_key test_config_decryption/test_wrong_settings.py::test_subnodes test_config_decryption/test_wrong_settings.py::test_wrong_method test_config_xml_yaml_mix/test.py::test_extra_yaml_mix test_config_yaml_main/test.py::test_yaml_main_conf -vvv" altinityinfra/integration-tests-runner:cd6390247eca '. 
Start tests
============================= test session starts ==============================
platform linux -- Python 3.10.12, pytest-7.4.4, pluggy-1.5.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /ClickHouse/tests/integration
configfile: pytest.ini
plugins: random-0.2, timeout-2.2.0, repeat-0.9.3, order-1.0.0, reportlog-0.4.0, xdist-3.5.0
timeout: 900.0s
timeout method: signal
timeout func_only: False
created: 10/10 workers
10 workers [100 items]

scheduling tests via LoadFileScheduling

test_catboost_evaluate/test.py::testAmazonModelManyRows
test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_different_nodes
test_allowed_url_from_config/test.py::test_HDFS
test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_avg
test_config_decryption/test_wrong_settings.py::test_invalid_chars
test_attach_partition_using_copy/test.py::test_all_replicated
test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool
test_backup_restore_azure_blob_storage/test.py::test_backup_restore
test_MemoryTracking/test.py::test_http
test_backup_restore_storage_policy/test.py::test_storage_policies[None--default]
[gw8] [ 1%] PASSED test_config_decryption/test_wrong_settings.py::test_invalid_chars
test_config_decryption/test_wrong_settings.py::test_no_encryption_key
[gw1] [ 2%] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[None--default]
test_backup_restore_storage_policy/test.py::test_storage_policies[None-None-default]
[gw1] [ 3%] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[None-None-default]
test_backup_restore_storage_policy/test.py::test_storage_policies[None-policy1-policy1]
[gw1] [ 4%] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[None-policy1-policy1]
test_backup_restore_storage_policy/test.py::test_storage_policies[policy1--default]
[gw1] [ 5%] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[policy1--default]
test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-None-policy1]
[gw1] [ 6%] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-None-policy1]
test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy1-policy1]
[gw1] [ 7%] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy1-policy1]
test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy2-policy2]
[gw1] [ 8%] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy2-policy2]
[gw8] [ 9%] PASSED test_config_decryption/test_wrong_settings.py::test_no_encryption_key
test_config_decryption/test_wrong_settings.py::test_subnodes
test_azure_blob_storage_plain_rewritable/test.py::test_drop_table
[gw9] [ 10%] PASSED test_MemoryTracking/test.py::test_http
test_MemoryTracking/test.py::test_tcp_multiple_sessions
[gw0] [ 11%] PASSED test_catboost_evaluate/test.py::testAmazonModelManyRows
test_catboost_evaluate/test.py::testAmazonModelSingleRow
[gw0] [ 12%] PASSED test_catboost_evaluate/test.py::testAmazonModelSingleRow
test_catboost_evaluate/test.py::testCategoricalFeatureMustBeNumericOrString
[gw6] [ 13%] PASSED test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_avg
[gw0] [ 14%] PASSED test_catboost_evaluate/test.py::testCategoricalFeatureMustBeNumericOrString
test_catboost_evaluate/test.py::testConstantFeatures
[gw0] [ 15%] PASSED test_catboost_evaluate/test.py::testConstantFeatures
test_catboost_evaluate/test.py::testFloatFeatureMustBeNumeric
[gw0] [ 16%] PASSED test_catboost_evaluate/test.py::testFloatFeatureMustBeNumeric
test_catboost_evaluate/test.py::testInvalidLibraryPath
[gw8] [ 17%] PASSED test_config_decryption/test_wrong_settings.py::test_subnodes
test_config_decryption/test_wrong_settings.py::test_wrong_method
[gw0] [ 18%] PASSED test_catboost_evaluate/test.py::testInvalidLibraryPath
test_catboost_evaluate/test.py::testInvalidModelPath
[gw0] [ 19%] PASSED test_catboost_evaluate/test.py::testInvalidModelPath
test_catboost_evaluate/test.py::testModelPathIsNotAConstString
test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[1000]
[gw0] [ 20%] PASSED test_catboost_evaluate/test.py::testModelPathIsNotAConstString
test_catboost_evaluate/test.py::testModelUpdate
[gw0] [ 21%] PASSED test_catboost_evaluate/test.py::testModelUpdate
test_catboost_evaluate/test.py::testNonConstantFeatures
[gw2] [ 22%] PASSED test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_different_nodes
[gw0] [ 23%] PASSED test_catboost_evaluate/test.py::testNonConstantFeatures
test_catboost_evaluate/test.py::testOnLowCardinalityFeatures
test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_same_node
[gw0] [ 24%] PASSED test_catboost_evaluate/test.py::testOnLowCardinalityFeatures
test_catboost_evaluate/test.py::testOnNullableFeatures
[gw0] [ 25%] PASSED test_catboost_evaluate/test.py::testOnNullableFeatures
test_catboost_evaluate/test.py::testRecoveryAfterCrash
[gw8] [ 26%] PASSED test_config_decryption/test_wrong_settings.py::test_wrong_method
test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_replicated_table
[gw6] [ 27%] PASSED test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[1000]
[gw0] [ 28%] PASSED test_catboost_evaluate/test.py::testRecoveryAfterCrash
test_catboost_evaluate/test.py::testSystemModelsAndModelRefresh
[gw0] [ 29%] PASSED test_catboost_evaluate/test.py::testSystemModelsAndModelRefresh
test_catboost_evaluate/test.py::testWrongNumberOfFeatureArguments
[gw0] [ 30%] PASSED test_catboost_evaluate/test.py::testWrongNumberOfFeatureArguments
test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[500000]
test_compressed_marks_restart/test.py::test_compressed_marks_restart_compact
[gw9] [ 31%] PASSED test_MemoryTracking/test.py::test_tcp_multiple_sessions
test_MemoryTracking/test.py::test_tcp_single_session
[gw2] [ 32%] PASSED test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_same_node
[gw3] [ 33%] PASSED test_allowed_url_from_config/test.py::test_HDFS
test_allowed_url_from_config/test.py::test_config_with_hosts
test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Atomic-MergeTree]
[gw3] [ 34%] PASSED test_allowed_url_from_config/test.py::test_config_with_hosts
test_allowed_url_from_config/test.py::test_config_with_only_primary_hosts
[gw7] [ 35%] PASSED test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool
test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool_replicated
[gw3] [ 36%] PASSED test_allowed_url_from_config/test.py::test_config_with_only_primary_hosts
test_allowed_url_from_config/test.py::test_config_with_only_regexp_hosts
[gw3] [ 37%] PASSED test_allowed_url_from_config/test.py::test_config_with_only_regexp_hosts
test_allowed_url_from_config/test.py::test_config_without_allowed_hosts
[gw8] [ 38%] PASSED test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_replicated_table
[gw3] [ 39%] PASSED test_allowed_url_from_config/test.py::test_config_without_allowed_hosts
test_allowed_url_from_config/test.py::test_config_without_allowed_hosts_section
test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_two_tables_with_uuid_in_zk_path
[gw3] [ 40%] PASSED test_allowed_url_from_config/test.py::test_config_without_allowed_hosts_section
test_allowed_url_from_config/test.py::test_redirect
[gw3] [ 41%] PASSED test_allowed_url_from_config/test.py::test_redirect
test_allowed_url_from_config/test.py::test_schema_inference
[gw3] [ 42%] PASSED test_allowed_url_from_config/test.py::test_schema_inference
test_allowed_url_from_config/test.py::test_table_function_remote
[gw6] [ 43%] PASSED test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[500000]
[gw0] [ 44%] PASSED test_compressed_marks_restart/test.py::test_compressed_marks_restart_compact
test_compressed_marks_restart/test.py::test_compressed_marks_restart_wide
[gw8] [ 45%] PASSED test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_two_tables_with_uuid_in_zk_path
test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[1000]
[gw0] [ 46%] PASSED test_compressed_marks_restart/test.py::test_compressed_marks_restart_wide
[gw9] [ 47%] PASSED test_MemoryTracking/test.py::test_tcp_single_session
test_aliases_in_default_expr_not_break_table_structure/test.py::test_aliases_in_default_expr_not_break_table_structure[ReplicatedMergeTree]
test_backward_compatibility/test_functions.py::test_aggregate_states
test_buffer_profile/test.py::test_buffer_profile
[gw6] [ 48%] PASSED test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[1000]
test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[500000]
[gw8] [ 49%] PASSED test_buffer_profile/test.py::test_buffer_profile
test_buffer_profile/test.py::test_default_profile
[gw8] [ 50%] PASSED test_buffer_profile/test.py::test_default_profile
test_backward_compatibility/test_aggregate_fixed_key.py::test_two_level_merge
[gw0] [ 51%] PASSED test_aliases_in_default_expr_not_break_table_structure/test.py::test_aliases_in_default_expr_not_break_table_structure[ReplicatedMergeTree]
[gw6] [ 52%] PASSED test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[500000]
[gw3] [ 53%] PASSED test_allowed_url_from_config/test.py::test_table_function_remote
[gw8] [ 54%] PASSED test_backward_compatibility/test_aggregate_fixed_key.py::test_two_level_merge
test_backup_restore_on_cluster/test_slow_rmt.py::test_replicated_database_async
test_backward_compatibility/test_ip_types_binary_compatibility.py::test_ip_types_binary_compatibility
test_asynchronous_metrics_pk_bytes_fields/test.py::test_total_pk_bytes_in_memory_fields
[gw9] [ 55%] PASSED test_backward_compatibility/test_functions.py::test_aggregate_states
test_backward_compatibility/test_functions.py::test_string_functions
[gw9] [ 56%] SKIPPED test_backward_compatibility/test_functions.py::test_string_functions
test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active
test_backward_compatibility/test.py::test_backward_compatability1
[gw8] [ 57%] PASSED test_backward_compatibility/test_ip_types_binary_compatibility.py::test_ip_types_binary_compatibility
test_backward_compatibility/test_normalized_count_comparison.py::test_select_aggregate_alias_column
[gw7] [ 58%] PASSED test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool_replicated
test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_two_replicas
[gw6] [ 59%] PASSED test_backup_restore_on_cluster/test_slow_rmt.py::test_replicated_database_async
[gw3] [ 60%] SKIPPED test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active
[gw8] [ 61%] PASSED test_backward_compatibility/test_normalized_count_comparison.py::test_select_aggregate_alias_column
test_asynchronous_metric_log_table/test.py::test_event_time_microseconds_field
test_backward_compatibility/test_vertical_merges_from_compact_parts.py::test_vertical_merges_from_compact_parts
[gw9] [ 62%] PASSED test_backward_compatibility/test.py::test_backward_compatability1
test_backward_compatibility/test_convert_ordinary.py::test_convert_ordinary_to_atomic
[gw2] [ 63%] PASSED test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Atomic-MergeTree]
[gw4] [ 64%] PASSED test_backup_restore_azure_blob_storage/test.py::test_backup_restore
test_backup_restore_azure_blob_storage/test.py::test_backup_restore_correct_block_ids
test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Lazy-Log]
[gw0] [ 65%] PASSED test_asynchronous_metrics_pk_bytes_fields/test.py::test_total_pk_bytes_in_memory_fields
[gw1] [ 66%] PASSED test_azure_blob_storage_plain_rewritable/test.py::test_drop_table
test_azure_blob_storage_plain_rewritable/test.py::test_insert_select
test_build_sets_from_multiple_threads/test.py::test_set
test_backup_s3_storage_class/test.py::test_backup_s3_storage_class
[gw1] [ 67%] PASSED test_azure_blob_storage_plain_rewritable/test.py::test_insert_select
test_azure_blob_storage_plain_rewritable/test.py::test_restart_server
[gw4] [ 68%] PASSED test_backup_restore_azure_blob_storage/test.py::test_backup_restore_correct_block_ids
test_backup_restore_azure_blob_storage/test.py::test_backup_restore_diff_container
[gw4] [ 69%] PASSED test_backup_restore_azure_blob_storage/test.py::test_backup_restore_diff_container
test_backup_restore_azure_blob_storage/test.py::test_backup_restore_on_merge_tree
[gw4] [ 70%] PASSED test_backup_restore_azure_blob_storage/test.py::test_backup_restore_on_merge_tree
test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf1
[gw4] [ 71%] PASSED test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf1
test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf2
[gw4] [ 72%] PASSED test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf2
[gw1] [ 73%] PASSED test_azure_blob_storage_plain_rewritable/test.py::test_restart_server
[gw3] [ 74%] PASSED test_asynchronous_metric_log_table/test.py::test_event_time_microseconds_field
[gw0] [ 75%] PASSED test_backup_s3_storage_class/test.py::test_backup_s3_storage_class
test_cluster_discovery/test.py::test_cluster_discovery_startup_and_stop
[gw9] [ 76%] PASSED test_build_sets_from_multiple_threads/test.py::test_set
test_config_yaml_main/test.py::test_yaml_main_conf
test_cluster_discovery/test_auxiliary_keeper.py::test_cluster_discovery_with_auxiliary_keeper_startup_and_stop
test_broken_part_during_merge/test.py::test_merge_and_part_corruption
[gw1] [ 77%] PASSED test_config_yaml_main/test.py::test_yaml_main_conf
[gw8] [ 78%] PASSED test_backward_compatibility/test_vertical_merges_from_compact_parts.py::test_vertical_merges_from_compact_parts
test_cluster_discovery/test_dynamic_clusters.py::test_cluster_discovery_startup_and_stop
[gw0] [ 79%] PASSED test_broken_part_during_merge/test.py::test_merge_and_part_corruption
[gw7] [ 80%] PASSED test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_two_replicas
test_concurrent_ttl_merges/test.py::test_no_ttl_merges_in_busy_pool
[gw7] [ 81%] PASSED test_concurrent_ttl_merges/test.py::test_no_ttl_merges_in_busy_pool
[gw9] [ 82%] PASSED test_cluster_discovery/test_auxiliary_keeper.py::test_cluster_discovery_with_auxiliary_keeper_startup_and_stop
[gw3] [ 83%] PASSED test_cluster_discovery/test.py::test_cluster_discovery_startup_and_stop
test_codec_encrypted/test.py::test_different_keys
test_composable_protocols/test.py::test_connections
[gw2] [ 84%] PASSED test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Lazy-Log]
test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Memory-MergeTree]
[gw3] [ 85%] PASSED test_composable_protocols/test.py::test_connections
[gw7] [ 86%] PASSED test_codec_encrypted/test.py::test_different_keys
[gw6] [ 87%] PASSED test_backward_compatibility/test_convert_ordinary.py::test_convert_ordinary_to_atomic
[gw8] [ 88%] PASSED test_cluster_discovery/test_dynamic_clusters.py::test_cluster_discovery_startup_and_stop
test_backward_compatibility/test_select_aggregate_alias_column.py::test_select_aggregate_alias_column
test_concurrent_queries_for_user_restriction/test.py::test_exception_message
[gw6] [ 89%] PASSED test_backward_compatibility/test_select_aggregate_alias_column.py::test_select_aggregate_alias_column
[gw8] [ 90%] PASSED test_concurrent_queries_for_user_restriction/test.py::test_exception_message
test_config_xml_yaml_mix/test.py::test_extra_yaml_mix
[gw6] [ 91%] PASSED test_config_xml_yaml_mix/test.py::test_extra_yaml_mix
[gw2] [ 92%] PASSED test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Memory-MergeTree]
test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Ordinary-MergeTree]
[gw2] [ 93%] PASSED test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Ordinary-MergeTree]
test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Replicated-ReplicatedMergeTree]
[gw2] [ 94%] PASSED test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Replicated-ReplicatedMergeTree]
test_backup_restore_on_cluster/test_concurrency.py::test_kill_mutation_during_backup
[gw2] [ 95%] PASSED test_backup_restore_on_cluster/test_concurrency.py::test_kill_mutation_during_backup
test_backup_restore_on_cluster/test_concurrency.py::test_replicated_table
[gw2] [ 96%] PASSED test_backup_restore_on_cluster/test_concurrency.py::test_replicated_table
[gw5] [ 97%] FAILED test_attach_partition_using_copy/test.py::test_all_replicated
test_attach_partition_using_copy/test.py::test_both_mergetree
[gw5] [ 98%] FAILED test_attach_partition_using_copy/test.py::test_both_mergetree
test_attach_partition_using_copy/test.py::test_not_work_on_different_disk
[gw5] [ 99%] FAILED test_attach_partition_using_copy/test.py::test_not_work_on_different_disk
test_attach_partition_using_copy/test.py::test_only_destination_replicated
[gw5] [100%] FAILED test_attach_partition_using_copy/test.py::test_only_destination_replicated

=================================== FAILURES ===================================
_____________________________ test_all_replicated ______________________________
[gw5] linux -- Python 3.10.12 /usr/bin/python3

start_cluster = 

    def test_all_replicated(start_cluster):
        cleanup([replica1, replica2])
>       create_source_table(replica1, "source", True)

test_attach_partition_using_copy/test.py:126: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test_attach_partition_using_copy/test.py:40: in create_source_table
    node.query_with_retry(
helpers/cluster.py:3713: in query_with_retry
    result = self.query(
helpers/cluster.py:3678: in query
    return self.client.query(
helpers/client.py:39: in wrap
    return func(self, *args, **kwargs)
helpers/client.py:79: in query
    ).get_answer()
helpers/client.py:230: in get_answer
    self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT)
/usr/lib/python3.10/subprocess.py:1209: in wait
    return self._wait(timeout=timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
timeout = 600

    def _wait(self, timeout):
        """Internal implementation of wait() on POSIX."""
        if self.returncode is not None:
            return self.returncode
        if timeout is not None:
            endtime = _time() + timeout
            # Enter a busy loop if we have a timeout. This busy loop was
            # cribbed from Lib/threading.py in Thread.wait() at r71065.
            delay = 0.0005  # 500 us -> initial delay of 1 ms
            while True:
                if self._waitpid_lock.acquire(False):
                    try:
                        if self.returncode is not None:
                            break  # Another thread waited.
                        (pid, sts) = self._try_wait(os.WNOHANG)
                        assert pid == self.pid or pid == 0
                        if pid == self.pid:
                            self._handle_exitstatus(sts)
                            break
                    finally:
                        self._waitpid_lock.release()
                remaining = self._remaining_time(endtime)
                if remaining <= 0:
                    raise TimeoutExpired(self.args, timeout)
                delay = min(delay * 2, remaining, .05)
>               time.sleep(delay)
E               Failed: Timeout >900.0s

/usr/lib/python3.10/subprocess.py:1953: Failed
---------------------------- Captured stdout setup -----------------------------
Copy common default production configuration from /clickhouse-config.
Files: config.xml, users.xml
Copy common default production configuration from /clickhouse-config.
Files: config.xml, users.xml
------------------------------ Captured log setup ------------------------------
2025-04-01 22:31:25 [ 665 ] DEBUG : Command:[docker ps | wc -l] (cluster.py:122, run_and_check)
2025-04-01 22:31:25 [ 665 ] DEBUG : Stdout:1 (cluster.py:146, run_and_check)
2025-04-01 22:31:25 [ 665 ] DEBUG : No running containers (conftest.py:96, cleanup_environment)
2025-04-01 22:31:25 [ 665 ] DEBUG : Pruning Docker networks (conftest.py:98, cleanup_environment)
2025-04-01 22:31:25 [ 665 ] DEBUG : Command:[docker network prune --force] (cluster.py:122, run_and_check)
2025-04-01 22:31:25 [ 665 ] DEBUG : Command:[sysctl net.ipv4.ip_local_port_range='55000 65535'] (cluster.py:122, run_and_check)
2025-04-01 22:31:25 [ 665 ] DEBUG : Stdout:net.ipv4.ip_local_port_range = 55000 65535 (cluster.py:146, run_and_check)
2025-04-01 22:31:25 [ 665 ] INFO : Running tests in /ClickHouse/tests/integration/test_attach_partition_using_copy/test.py (cluster.py:2793, start)
2025-04-01 22:31:25 [ 665 ] DEBUG : Cluster start called. is_up=False (cluster.py:2800, start)
2025-04-01 22:31:25 [ 665 ] DEBUG : Docker networks for project roottestattachpartitionusingcopy-gw5 are NETWORK ID NAME DRIVER SCOPE (cluster.py:873, print_all_docker_pieces)
2025-04-01 22:31:25 [ 665 ] DEBUG : Docker containers for project roottestattachpartitionusingcopy-gw5 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:881, print_all_docker_pieces)
2025-04-01 22:31:25 [ 665 ] DEBUG : Docker volumes for project roottestattachpartitionusingcopy-gw5 are DRIVER VOLUME NAME (cluster.py:889, print_all_docker_pieces)
2025-04-01 22:31:25 [ 665 ] DEBUG : Cleanup called (cluster.py:894, cleanup)
2025-04-01 22:31:25 [ 665 ] DEBUG : Docker networks for project roottestattachpartitionusingcopy-gw5 are NETWORK ID NAME DRIVER SCOPE (cluster.py:873, print_all_docker_pieces)
2025-04-01 22:31:25 [ 665 ] DEBUG : Docker containers for project roottestattachpartitionusingcopy-gw5 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:881, print_all_docker_pieces)
2025-04-01 22:31:25 [ 665 ] DEBUG : Docker volumes for project roottestattachpartitionusingcopy-gw5 are DRIVER VOLUME NAME (cluster.py:889, print_all_docker_pieces)
2025-04-01 22:31:25 [ 665 ] DEBUG : Command:[docker container list --all --filter name='^/roottestattachpartitionusingcopy-gw5-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:122, run_and_check)
2025-04-01 22:31:25 [ 665 ] DEBUG : Unstopped containers: {} (cluster.py:908, cleanup)
2025-04-01 22:31:25 [ 665 ] DEBUG : No running containers for project: roottestattachpartitionusingcopy-gw5 (cluster.py:922, cleanup)
2025-04-01 22:31:25 [ 665 ] DEBUG : Trying to prune unused networks... (cluster.py:928, cleanup)
2025-04-01 22:31:25 [ 665 ] DEBUG : Trying to prune unused images... (cluster.py:944, cleanup)
2025-04-01 22:31:25 [ 665 ] DEBUG : Command:[docker image prune -f] (cluster.py:122, run_and_check)
2025-04-01 22:31:25 [ 665 ] DEBUG : Stderr:Error response from daemon: a prune operation is already running (cluster.py:148, run_and_check)
2025-04-01 22:31:25 [ 665 ] DEBUG : Exitcode:1 (cluster.py:150, run_and_check)
2025-04-01 22:31:25 [ 665 ] DEBUG : Trying to prune unused volumes... (cluster.py:953, cleanup)
2025-04-01 22:31:25 [ 665 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:122, run_and_check)
2025-04-01 22:31:25 [ 665 ] DEBUG : Stdout:1 (cluster.py:146, run_and_check)
2025-04-01 22:31:25 [ 665 ] DEBUG : Volumes pruned: 1 (cluster.py:958, cleanup)
2025-04-01 22:31:25 [ 665 ] DEBUG : Setup directory for instance: replica1 (cluster.py:2813, start)
2025-04-01 22:31:25 [ 665 ] DEBUG : Create directory for configuration generated in this helper (cluster.py:4639, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Create directory for common tests configuration (cluster.py:4644, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Copy common configuration from helpers (cluster.py:4664, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Generate and write macros file (cluster.py:4716, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Copy custom test config files ['/ClickHouse/tests/integration/test_attach_partition_using_copy/configs/remote_servers.xml'] to /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica1/configs/config.d (cluster.py:4752, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Setup database dir /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica1/database (cluster.py:4769, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Setup logs dir /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica1/logs (cluster.py:4780, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log", "--"] (cluster.py:4864, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Setup directory for instance: replica2 (cluster.py:2813, start)
2025-04-01 22:31:25 [ 665 ] DEBUG : Create directory for configuration generated in this helper (cluster.py:4639, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Create directory for common tests configuration (cluster.py:4644, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Copy common configuration from helpers (cluster.py:4664, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Generate and write macros file (cluster.py:4716, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Copy custom test config files ['/ClickHouse/tests/integration/test_attach_partition_using_copy/configs/remote_servers.xml'] to /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica2/configs/config.d (cluster.py:4752, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Setup database dir /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica2/database (cluster.py:4769, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Setup logs dir /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica2/logs (cluster.py:4780, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log", "--"] (cluster.py:4864, create_dir)
2025-04-01 22:31:25 [ 665 ] DEBUG : Env {'ASAN_OPTIONS': 'use_sigaltstack=0', 'TSAN_OPTIONS': 'use_sigaltstack=0', 'CLICKHOUSE_WATCHDOG_ENABLE': '0', 'CLICKHOUSE_NATS_TLS_SECURE': '0', 'LLVM_PROFILE_FILE': '/var/lib/clickhouse/server_%h_%p_%m.profraw', 'keeper_binary': '/clickhouse', 'keeper_cmd_prefix': 'clickhouse keeper', 'image': 'altinityinfra/integration-test:6712d5cc610d', 'user': '0', 'keeper_fs': 'bind', 'keeper_logs_dir1': '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper1/log', 'keeper_config_dir1': '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper1/config', 'keeper_db_dir1': '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper1/coordination', 'keeper_logs_dir2': '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper2/log', 'keeper_config_dir2': '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper2/config', 'keeper_db_dir2': '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper2/coordination', 'keeper_logs_dir3': '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper3/log', 'keeper_config_dir3': '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper3/config', 'keeper_db_dir3': '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper3/coordination'} stored in /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/.env (cluster.py:97, _create_env_file)
2025-04-01 22:31:25 [ 665 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file)
2025-04-01 22:31:25 [ 665 ] DEBUG : No config file found (config.py:28, find_config_file)
2025-04-01 22:31:25 [ 665 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file)
2025-04-01 22:31:25 [ 665 ] DEBUG : No config file found (config.py:28, find_config_file)
2025-04-01 22:31:25 [ 665 ] DEBUG : http://localhost:None "GET /version HTTP/1.1" 200 826 (connectionpool.py:547, _make_request)
2025-04-01 22:31:25 [ 665 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/.env --project-name roottestattachpartitionusingcopy-gw5 --file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica1/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica2/docker-compose.yml pull] (cluster.py:122, run_and_check)
2025-04-01 22:31:36 [ 665 ] DEBUG : Stderr: replica1 Skipped - Image is already being pulled by replica2 (cluster.py:148, run_and_check)
2025-04-01 22:31:36 [ 665 ] DEBUG : Stderr: zoo2 Skipped - Image is already being pulled by replica2 (cluster.py:148, run_and_check)
2025-04-01 22:31:36 [ 665 ] DEBUG : Stderr: zoo3 Skipped - Image is already being pulled by replica2 (cluster.py:148, run_and_check)
2025-04-01 22:31:36 [ 665 ] DEBUG : Stderr: zoo1 Skipped - Image is already being pulled by replica2 (cluster.py:148, run_and_check)
2025-04-01 22:31:36 [ 665 ] DEBUG : Stderr: replica2 Pulling (cluster.py:148, run_and_check)
2025-04-01 22:31:36 [ 665 ] DEBUG : Stderr: replica2 Pulled (cluster.py:148, run_and_check)
2025-04-01 22:31:36 [ 665 ] DEBUG : Setup ZooKeeper (cluster.py:2854, start)
2025-04-01 22:31:36 [ 665 ] DEBUG : Creating internal ZooKeeper dirs: ['/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper1/log', '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper1/config', '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper1/coordination', '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper2/log', '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper2/config', '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper2/coordination', '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper3/log', '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper3/config', '/ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/keeper3/coordination'] (cluster.py:2855, start)
2025-04-01 22:31:36 [ 665 ] DEBUG : Command:[docker compose --project-name roottestattachpartitionusingcopy-gw5 --env-file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/.env --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --verbose up -d] (cluster.py:122, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr:time="2025-04-01T22:31:36Z" level=trace msg="Docker Desktop integration not enabled" (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Network roottestattachpartitionusingcopy-gw5_default Creating (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Network roottestattachpartitionusingcopy-gw5_default Created (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo2-1 Creating (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo3-1 Creating (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo1-1 Creating (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo1-1 Created (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo3-1 Created (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo2-1 Created (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo1-1 Starting (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo2-1 Starting (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo3-1 Starting (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo2-1 Started (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo3-1 Started (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo1-1 Started (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr:time="2025-04-01T22:31:37Z" level=debug msg="otel error" error="" (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Stderr:time="2025-04-01T22:31:37Z" level=debug msg="otel error" error="" (cluster.py:148, run_and_check)
2025-04-01 22:31:37 [ 665 ] DEBUG : Wait ZooKeeper to start (cluster.py:2466, wait_zookeeper_to_start)
2025-04-01 22:31:37 [ 665 ] DEBUG : get_instance_ip instance_name=zoo1 (cluster.py:2082, get_instance_ip)
2025-04-01 22:31:37 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestattachpartitionusingcopy-gw5-zoo1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:37 [ 665 ] DEBUG : get_kazoo_client: zoo1, ip:172.16.5.4, port:2181, use_ssl:False (cluster.py:3341, get_kazoo_client)
2025-04-01 22:31:37 [ 665 ] INFO : Connecting to 172.16.5.4(172.16.5.4):2181, use_ssl: False (connection.py:650, _connect)
2025-04-01 22:31:37 [ 665 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt)
2025-04-01 22:31:37 [ 665 ] INFO : Connecting to 172.16.5.4(172.16.5.4):2181, use_ssl: False (connection.py:650, _connect)
2025-04-01 22:31:37 [ 665 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt)
2025-04-01 22:31:38 [ 665 ] INFO : Connecting to 172.16.5.4(172.16.5.4):2181, use_ssl: False (connection.py:650, _connect)
2025-04-01 22:31:38 [ 665 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt)
2025-04-01 22:31:38 [ 665 ] INFO : Connecting to 172.16.5.4(172.16.5.4):2181, use_ssl: False (connection.py:650, _connect)
2025-04-01 22:31:38 [ 665 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt)
2025-04-01 22:31:38 [ 665 ] INFO : Connecting to 172.16.5.4(172.16.5.4):2181, use_ssl: False (connection.py:650, _connect)
2025-04-01 22:31:38 [ 665 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt)
2025-04-01 22:31:39 [ 665 ] INFO : Connecting to 172.16.5.4(172.16.5.4):2181, use_ssl: False (connection.py:650, _connect)
2025-04-01 22:31:39 [ 665 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt)
2025-04-01 22:31:40 [ 665 ] INFO : Connecting to 172.16.5.4(172.16.5.4):2181, use_ssl: False (connection.py:650, _connect)
2025-04-01 22:31:40 [ 665 ] DEBUG : Sending request(xid=None): Connect(protocol_version=0, last_zxid_seen=0, time_out=10000, session_id=0, passwd=b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', read_only=None) (connection.py:312, _submit)
2025-04-01 22:31:40 [ 665 ] INFO : Zookeeper connection established, state: CONNECTED (client.py:532, _session_callback)
2025-04-01 22:31:40 [ 665 ] DEBUG : Sending request(xid=1): GetChildren(path='/', watcher=None) (connection.py:312, _submit)
2025-04-01 22:31:40 [ 665 ] DEBUG : Received response(xid=1): ['keeper'] (connection.py:410, _read_response)
2025-04-01 22:31:40 [ 665 ] DEBUG : Sending request(xid=2): Close() (connection.py:312, _submit)
2025-04-01 22:31:40 [ 665 ] WARNING : Connection dropped: socket connection broken (connection.py:622, _connect_attempt)
2025-04-01 22:31:40 [ 665 ] WARNING : Transition to CONNECTING (connection.py:626, _connect_attempt)
2025-04-01 22:31:40 [ 665 ] INFO : Zookeeper connection lost (client.py:543, _session_callback)
2025-04-01 22:31:40 [ 665 ] WARNING : Failed connecting to Zookeeper within the connection retry policy. (connection.py:515, zk_loop)
2025-04-01 22:31:40 [ 665 ] INFO : Zookeeper session closed, state: CLOSED (client.py:537, _session_callback)
2025-04-01 22:31:40 [ 665 ] DEBUG : get_instance_ip instance_name=zoo2 (cluster.py:2082, get_instance_ip)
2025-04-01 22:31:40 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestattachpartitionusingcopy-gw5-zoo2-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:40 [ 665 ] DEBUG : get_kazoo_client: zoo2, ip:172.16.5.2, port:2181, use_ssl:False (cluster.py:3341, get_kazoo_client)
2025-04-01 22:31:40 [ 665 ] INFO : Connecting to 172.16.5.2(172.16.5.2):2181, use_ssl: False (connection.py:650, _connect)
2025-04-01 22:31:40 [ 665 ] DEBUG : Sending request(xid=None): Connect(protocol_version=0, last_zxid_seen=0, time_out=10000, session_id=0, passwd=b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', read_only=None) (connection.py:312, _submit)
2025-04-01 22:31:40 [ 665 ] INFO : Zookeeper connection established, state: CONNECTED (client.py:532, _session_callback)
2025-04-01 22:31:40 [ 665 ] DEBUG : Sending request(xid=1): GetChildren(path='/', watcher=None) (connection.py:312, _submit)
2025-04-01 22:31:40 [ 665 ] DEBUG : Received response(xid=1): ['keeper'] (connection.py:410, _read_response)
2025-04-01 22:31:40 [ 665 ] DEBUG : Sending request(xid=2): Close() (connection.py:312, _submit)
2025-04-01 22:31:40 [ 665 ] WARNING : Connection dropped: socket connection broken (connection.py:622, _connect_attempt)
2025-04-01 22:31:40 [ 665 ] WARNING : Transition to CONNECTING (connection.py:626, _connect_attempt)
2025-04-01 22:31:40 [ 665 ] INFO : Zookeeper connection lost (client.py:543, _session_callback)
2025-04-01 22:31:41 [ 665 ] WARNING : Failed connecting to Zookeeper within the connection retry policy. (connection.py:515, zk_loop)
2025-04-01 22:31:41 [ 665 ] INFO : Zookeeper session closed, state: CLOSED (client.py:537, _session_callback)
2025-04-01 22:31:41 [ 665 ] DEBUG : get_instance_ip instance_name=zoo3 (cluster.py:2082, get_instance_ip)
2025-04-01 22:31:41 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestattachpartitionusingcopy-gw5-zoo3-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:41 [ 665 ] DEBUG : get_kazoo_client: zoo3, ip:172.16.5.3, port:2181, use_ssl:False (cluster.py:3341, get_kazoo_client)
2025-04-01 22:31:41 [ 665 ] INFO : Connecting to 172.16.5.3(172.16.5.3):2181, use_ssl: False (connection.py:650, _connect)
2025-04-01 22:31:41 [ 665 ] DEBUG : Sending request(xid=None): Connect(protocol_version=0, last_zxid_seen=0, time_out=10000, session_id=0, passwd=b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', read_only=None) (connection.py:312, _submit)
2025-04-01 22:31:41 [ 665 ] INFO : Zookeeper connection established, state: CONNECTED (client.py:532, _session_callback)
2025-04-01 22:31:41 [ 665 ] DEBUG : Sending request(xid=1): GetChildren(path='/', watcher=None) (connection.py:312, _submit)
2025-04-01 22:31:41 [ 665 ] DEBUG : Received response(xid=1): ['keeper'] (connection.py:410, _read_response)
2025-04-01 22:31:41 [ 665 ] DEBUG : Sending request(xid=2): Close() (connection.py:312, _submit)
2025-04-01 22:31:41 [ 665 ] WARNING : Connection dropped: socket connection broken (connection.py:622, _connect_attempt)
2025-04-01 22:31:41 [ 665 ] WARNING : Transition to CONNECTING (connection.py:626, _connect_attempt)
2025-04-01 22:31:41 [ 665 ] INFO : Zookeeper connection lost (client.py:543, _session_callback)
2025-04-01 22:31:41 [ 665 ] WARNING : Failed connecting to Zookeeper within the connection retry policy. (connection.py:515, zk_loop)
2025-04-01 22:31:41 [ 665 ] INFO : Zookeeper session closed, state: CLOSED (client.py:537, _session_callback)
2025-04-01 22:31:41 [ 665 ] DEBUG : All instances of ZooKeeper started: ('zoo1', 'zoo2', 'zoo3') (cluster.py:2482, wait_zookeeper_nodes_to_start)
2025-04-01 22:31:41 [ 665 ] DEBUG : ('Trying to create ClickHouse instance by command %s', 'docker compose --env-file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/.env --project-name roottestattachpartitionusingcopy-gw5 --file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica1/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica2/docker-compose.yml up -d --no-recreate') (cluster.py:3200, start)
2025-04-01 22:31:41 [ 665 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/.env --project-name roottestattachpartitionusingcopy-gw5 --file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica1/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica2/docker-compose.yml up -d --no-recreate] (cluster.py:122, run_and_check)
2025-04-01 22:31:41 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo3-1 Running (cluster.py:148, run_and_check)
2025-04-01 22:31:41 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo2-1 Running (cluster.py:148, run_and_check)
2025-04-01 22:31:41 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo1-1 Running (cluster.py:148, run_and_check)
2025-04-01 22:31:41 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica1-1 Creating (cluster.py:148, run_and_check)
2025-04-01 22:31:41 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica2-1 Creating (cluster.py:148, run_and_check)
2025-04-01 22:31:41 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica2-1 Created (cluster.py:148, run_and_check)
2025-04-01 22:31:41 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica1-1 Created (cluster.py:148, run_and_check)
2025-04-01 22:31:41 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica1-1 Starting (cluster.py:148, run_and_check)
2025-04-01 22:31:41 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica2-1 Starting (cluster.py:148, run_and_check)
2025-04-01 22:31:41 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica2-1 Started (cluster.py:148, run_and_check)
2025-04-01 22:31:41 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica1-1 Started (cluster.py:148, run_and_check)
2025-04-01 22:31:41 [ 665 ] DEBUG : ClickHouse instance created (cluster.py:3208, start)
2025-04-01 22:31:41 [ 665 ] DEBUG : get_instance_ip instance_name=replica1 (cluster.py:2082, get_instance_ip)
2025-04-01 22:31:41 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestattachpartitionusingcopy-gw5-replica1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:41 [ 665 ] DEBUG : get_instance_ip instance_name=replica1 (cluster.py:2092, get_instance_global_ipv6)
2025-04-01 22:31:41 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestattachpartitionusingcopy-gw5-replica1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:41 [ 665 ] DEBUG : Waiting for ClickHouse start in replica1, ip: 172.16.5.6... (cluster.py:3216, start)
2025-04-01 22:31:41 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestattachpartitionusingcopy-gw5-replica1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:41 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:41 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:42 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:42 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:42 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:42 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:42 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:42 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:42 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:42 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:42 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:43 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:43 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:43 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:43 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:43 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:43 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3aec84b580312b0684bd3a5b1407abb4f3ca31de17aed8aaf580dcfaa1fab380/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:43 [ 665 ] DEBUG : ClickHouse replica1 started (cluster.py:3220, start)
2025-04-01 22:31:43 [ 665 ] DEBUG : get_instance_ip instance_name=replica2 (cluster.py:2082, get_instance_ip)
2025-04-01 22:31:43 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestattachpartitionusingcopy-gw5-replica2-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:43 [ 665 ] DEBUG : get_instance_ip instance_name=replica2 (cluster.py:2092, get_instance_global_ipv6)
2025-04-01 22:31:43 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestattachpartitionusingcopy-gw5-replica2-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:43 [ 665 ] DEBUG : Waiting for ClickHouse start in replica2, ip: 172.16.5.5... (cluster.py:3216, start)
2025-04-01 22:31:43 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestattachpartitionusingcopy-gw5-replica2-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:43 [ 665 ] DEBUG : http://localhost:None "GET /v1.46/containers/3943eba3e5f820e94a0bccaa161851c79026256b8d75c5a98e50cd5ef4c8fc86/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-04-01 22:31:43 [ 665 ] DEBUG : ClickHouse replica2 started (cluster.py:3220, start)
----------------------------- Captured stderr call -----------------------------
~~~~~~~~~~~~~~~~~~~~~ Stack of (139834861565504) ~~~~~~~~~~~~~~~~~~~~~
  File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 411, in _perform_spawn
    reply.run()
  File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 341, in run
    self._result = func(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 1160, in _thread_receiver
    msg = Message.from_io(io)
  File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 567, in from_io
    header = io.read(9)  # type 1, channel 4, payload 4
  File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 534, in read
    data = self._read(numbytes - len(buf))
------------------------------ Captured log call -------------------------------
2025-04-01 22:31:43 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS source SYNC on replica1 (cluster.py:3677, query)
2025-04-01 22:31:43 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS destination SYNC on replica1 (cluster.py:3677, query)
2025-04-01 22:31:43 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS source SYNC on replica2 (cluster.py:3677, query)
2025-04-01 22:31:44 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS destination SYNC on replica2 (cluster.py:3677, query)
2025-04-01 22:31:44 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query)
2025-04-01 22:32:38 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query)
2025-04-01 22:33:32 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query)
2025-04-01 22:34:28 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query)
2025-04-01 22:35:23 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String),
district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:36:20 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:37:16 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:38:10 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:39:06 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = 
disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:40:03 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:41:00 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:41:55 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:42:50 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:43:47 [ 665 ] DEBUG : Executing query ATTACH 
TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:44:45 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:45:40 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = ReplicatedMergeTree('/clickhouse/tables/1/source', 'replica1') ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) _____________________________ test_both_mergetree ______________________________ [gw5] linux -- Python 3.10.12 /usr/bin/python3 start_cluster = def test_both_mergetree(start_cluster): cleanup([replica1, replica2]) > create_source_table(replica1, "source", False) test_attach_partition_using_copy/test.py:104: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test_attach_partition_using_copy/test.py:40: in create_source_table node.query_with_retry( helpers/cluster.py:3713: in query_with_retry result = self.query( helpers/cluster.py:3678: in query return self.client.query( helpers/client.py:39: in wrap return func(self, *args, **kwargs) helpers/client.py:79: in query ).get_answer() helpers/client.py:230: in get_answer self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT) /usr/lib/python3.10/subprocess.py:1209: in wait return self._wait(timeout=timeout) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = timeout = 600 def _wait(self, timeout): """Internal implementation of wait() on POSIX.""" if self.returncode is not None: 
return self.returncode if timeout is not None: endtime = _time() + timeout # Enter a busy loop if we have a timeout. This busy loop was # cribbed from Lib/threading.py in Thread.wait() at r71065. delay = 0.0005 # 500 us -> initial delay of 1 ms while True: if self._waitpid_lock.acquire(False): try: if self.returncode is not None: break # Another thread waited. (pid, sts) = self._try_wait(os.WNOHANG) assert pid == self.pid or pid == 0 if pid == self.pid: self._handle_exitstatus(sts) break finally: self._waitpid_lock.release() remaining = self._remaining_time(endtime) if remaining <= 0: raise TimeoutExpired(self.args, timeout) delay = min(delay * 2, remaining, .05) > time.sleep(delay) E Failed: Timeout >900.0s /usr/lib/python3.10/subprocess.py:1953: Failed ----------------------------- Captured stderr call ----------------------------- ~~~~~~~~~~~~~~~~~~~~~ Stack of (139834861565504) ~~~~~~~~~~~~~~~~~~~~~ File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 411, in _perform_spawn reply.run() File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 341, in run self._result = func(*args, **kwargs) File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 1160, in _thread_receiver msg = Message.from_io(io) File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 567, in from_io header = io.read(9) # type 1, channel 4, payload 4 File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 534, in read data = self._read(numbytes - len(buf)) ------------------------------ Captured log call ------------------------------- 2025-04-01 22:46:25 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS source SYNC on replica1 (cluster.py:3677, query) 2025-04-01 22:46:34 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS destination SYNC on replica1 (cluster.py:3677, query) 2025-04-01 22:46:35 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS source SYNC on replica2 (cluster.py:3677, query) 2025-04-01 22:46:35 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS destination SYNC on replica2 (cluster.py:3677, query) 2025-04-01 22:46:35 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:47:32 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) 
SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:48:30 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:49:25 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:50:20 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:51:17 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:52:15 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 
'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:53:09 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:54:04 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:55:01 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:55:59 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type 
= web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:56:54 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:57:49 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:58:46 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 22:59:44 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:00:38 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 
'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) _______________________ test_not_work_on_different_disk ________________________ [gw5] linux -- Python 3.10.12 /usr/bin/python3 start_cluster = def test_not_work_on_different_disk(start_cluster): cleanup([replica1, replica2]) # Replace and move should not work on replace > create_source_table(replica1, "source", False) test_attach_partition_using_copy/test.py:197: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test_attach_partition_using_copy/test.py:40: in create_source_table node.query_with_retry( helpers/cluster.py:3713: in query_with_retry result = self.query( helpers/cluster.py:3678: in query return self.client.query( helpers/client.py:39: in wrap return func(self, *args, **kwargs) helpers/client.py:79: in query ).get_answer() helpers/client.py:230: in get_answer self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT) /usr/lib/python3.10/subprocess.py:1209: in wait return self._wait(timeout=timeout) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = timeout = 600 def _wait(self, timeout): """Internal implementation of wait() on POSIX.""" if self.returncode is not None: return self.returncode if timeout is not None: endtime = _time() + timeout # Enter a busy loop if we have a timeout. This busy loop was # cribbed from Lib/threading.py in Thread.wait() at r71065. delay = 0.0005 # 500 us -> initial delay of 1 ms while True: if self._waitpid_lock.acquire(False): try: if self.returncode is not None: break # Another thread waited. 
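# Annotation (not part of the captured subprocess.py source above):
# create_source_table() issues the ATTACH through node.query_with_retry(), so
# the same statement is re-sent roughly once a minute for the whole test (see
# the timestamps in the captured log). Each attempt can block in
# self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT), 600 s here, and the
# per-test 900 s timeout fires while an attempt is still waiting. A rough
# sketch of that retry shape, assuming a simplified wrapper (the real
# helpers/cluster.py query_with_retry takes more parameters):
#
#     import time
#
#     def query_with_retry(node, sql, retry_count=20, sleep_time=0.5):
#         for attempt in range(retry_count):
#             try:
#                 return node.query(sql)  # may block up to the 600 s client timeout
#             except Exception:
#                 if attempt + 1 == retry_count:
#                     raise
#                 time.sleep(sleep_time)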
(pid, sts) = self._try_wait(os.WNOHANG) assert pid == self.pid or pid == 0 if pid == self.pid: self._handle_exitstatus(sts) break finally: self._waitpid_lock.release() remaining = self._remaining_time(endtime) if remaining <= 0: raise TimeoutExpired(self.args, timeout) delay = min(delay * 2, remaining, .05) > time.sleep(delay) E Failed: Timeout >900.0s /usr/lib/python3.10/subprocess.py:1953: Failed ----------------------------- Captured stderr call ----------------------------- ~~~~~~~~~~~~~~~~~~~~~ Stack of (139834861565504) ~~~~~~~~~~~~~~~~~~~~~ File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 411, in _perform_spawn reply.run() File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 341, in run self._result = func(*args, **kwargs) File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 1160, in _thread_receiver msg = Message.from_io(io) File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 567, in from_io header = io.read(9) # type 1, channel 4, payload 4 File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 534, in read data = self._read(numbytes - len(buf)) ------------------------------ Captured log call ------------------------------- 2025-04-01 23:01:26 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS source SYNC on replica1 (cluster.py:3677, query) 2025-04-01 23:01:33 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS destination SYNC on replica1 (cluster.py:3677, query) 2025-04-01 23:01:33 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS source SYNC on replica2 (cluster.py:3677, query) 2025-04-01 23:01:33 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS destination SYNC on replica2 (cluster.py:3677, query) 2025-04-01 23:01:34 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:02:31 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:03:28 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 
LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:04:23 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:05:18 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:06:15 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:07:13 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER 
BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:08:08 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:09:03 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:10:00 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:10:58 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:11:52 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), 
type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:12:48 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:13:45 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:14:42 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:15:37 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, 
addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) _______________________ test_only_destination_replicated _______________________ [gw5] linux -- Python 3.10.12 /usr/bin/python3 start_cluster = def test_only_destination_replicated(start_cluster): cleanup([replica1, replica2]) > create_source_table(replica1, "source", False) test_attach_partition_using_copy/test.py:161: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test_attach_partition_using_copy/test.py:40: in create_source_table node.query_with_retry( helpers/cluster.py:3713: in query_with_retry result = self.query( helpers/cluster.py:3678: in query return self.client.query( helpers/client.py:39: in wrap return func(self, *args, **kwargs) helpers/client.py:79: in query ).get_answer() helpers/client.py:230: in get_answer self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT) /usr/lib/python3.10/subprocess.py:1209: in wait return self._wait(timeout=timeout) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = timeout = 600 def _wait(self, timeout): """Internal implementation of wait() on POSIX.""" if self.returncode is not None: return self.returncode if timeout is not None: endtime = _time() + timeout # Enter a busy loop if we have a timeout. This busy loop was # cribbed from Lib/threading.py in Thread.wait() at r71065. delay = 0.0005 # 500 us -> initial delay of 1 ms while True: if self._waitpid_lock.acquire(False): try: if self.returncode is not None: break # Another thread waited. (pid, sts) = self._try_wait(os.WNOHANG) assert pid == self.pid or pid == 0 if pid == self.pid: self._handle_exitstatus(sts) break finally: self._waitpid_lock.release() remaining = self._remaining_time(endtime) if remaining <= 0: raise TimeoutExpired(self.args, timeout) delay = min(delay * 2, remaining, .05) > time.sleep(delay) E Failed: Timeout >900.0s /usr/lib/python3.10/subprocess.py:1953: Failed ----------------------------- Captured stderr call ----------------------------- ~~~~~~~~~~~~~~~~~~~~~ Stack of (139834861565504) ~~~~~~~~~~~~~~~~~~~~~ File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 411, in _perform_spawn reply.run() File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 341, in run self._result = func(*args, **kwargs) File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 1160, in _thread_receiver msg = Message.from_io(io) File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 567, in from_io header = io.read(9) # type 1, channel 4, payload 4 File "/usr/local/lib/python3.10/dist-packages/execnet/gateway_base.py", line 534, in read data = self._read(numbytes - len(buf)) ------------------------------ Captured log call ------------------------------- 2025-04-01 23:16:26 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS source SYNC on replica1 (cluster.py:3677, query) 2025-04-01 23:16:32 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS destination SYNC on replica1 (cluster.py:3677, query) 2025-04-01 23:16:32 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS source SYNC on replica2 (cluster.py:3677, query) 2025-04-01 23:16:32 [ 665 ] DEBUG : Executing query DROP TABLE IF EXISTS destination SYNC on replica2 (cluster.py:3677, query) 2025-04-01 23:16:32 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date 
Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:17:29 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:18:27 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:19:21 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:20:17 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county 
LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:21:14 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:22:11 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:23:06 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:24:01 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:24:58 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 
LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:25:56 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:26:51 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:27:46 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:28:43 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county 
LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:29:41 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) 2025-04-01 23:30:35 [ 665 ] DEBUG : Executing query ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7' ( price UInt32, date Date, postcode1 LowCardinality(String), postcode2 LowCardinality(String), type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4), is_new UInt8, duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2), addr1 String, addr2 String, street LowCardinality(String), locality LowCardinality(String), town LowCardinality(String), district LowCardinality(String), county LowCardinality(String) ) ENGINE = MergeTree() ORDER BY (postcode1, postcode2, addr1, addr2) SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/') on replica1 (cluster.py:3677, query) ---------------------------- Captured log teardown ----------------------------- 2025-04-01 23:31:26 [ 665 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/.env --project-name roottestattachpartitionusingcopy-gw5 --file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica1/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica2/docker-compose.yml stop --timeout 20] (cluster.py:122, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica1-1 Stopping (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica2-1 Stopping (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica2-1 Stopped (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica1-1 Stopped (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo1-1 Stopping (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo3-1 Stopping (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo2-1 Stopping (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container 
roottestattachpartitionusingcopy-gw5-zoo1-1 Stopped (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo3-1 Stopped (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo2-1 Stopped (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Command:[bash -c [ -f /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica1/logs/stderr.log ] && zgrep -aH "==================" /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica1/logs/stderr.log* | ( [ -z "" ] && cat || grep -v "$" ) || true] (cluster.py:122, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Command:[bash -c [ -f /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica2/logs/stderr.log ] && zgrep -aH "==================" /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica2/logs/stderr.log* | ( [ -z "" ] && cat || grep -v "$" ) || true] (cluster.py:122, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/.env --project-name roottestattachpartitionusingcopy-gw5 --file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica1/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_attach_partition_using_copy/_instances-0-gw5/replica2/docker-compose.yml down --volumes] (cluster.py:122, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica1-1 Stopping (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica2-1 Stopping (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica1-1 Stopped (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica1-1 Removing (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica2-1 Stopped (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica2-1 Removing (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica2-1 Removed (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-replica1-1 Removed (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo1-1 Stopping (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo2-1 Stopping (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo3-1 Stopping (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo1-1 Stopped (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo1-1 Removing (cluster.py:148, run_and_check) 
2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo2-1 Stopped (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo2-1 Removing (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo3-1 Stopped (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo3-1 Removing (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo1-1 Removed (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo3-1 Removed (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Container roottestattachpartitionusingcopy-gw5-zoo2-1 Removed (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Network roottestattachpartitionusingcopy-gw5_default Removing (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Stderr: Network roottestattachpartitionusingcopy-gw5_default Removed (cluster.py:148, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Cleanup called (cluster.py:894, cleanup) 2025-04-01 23:31:38 [ 665 ] DEBUG : Docker networks for project roottestattachpartitionusingcopy-gw5 are NETWORK ID NAME DRIVER SCOPE (cluster.py:873, print_all_docker_pieces) 2025-04-01 23:31:38 [ 665 ] DEBUG : Docker containers for project roottestattachpartitionusingcopy-gw5 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:881, print_all_docker_pieces) 2025-04-01 23:31:38 [ 665 ] DEBUG : Docker volumes for project roottestattachpartitionusingcopy-gw5 are DRIVER VOLUME NAME (cluster.py:889, print_all_docker_pieces) 2025-04-01 23:31:38 [ 665 ] DEBUG : Command:[docker container list --all --filter name='^/roottestattachpartitionusingcopy-gw5-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:122, run_and_check) 2025-04-01 23:31:38 [ 665 ] DEBUG : Unstopped containers: {} (cluster.py:908, cleanup) 2025-04-01 23:31:38 [ 665 ] DEBUG : No running containers for project: roottestattachpartitionusingcopy-gw5 (cluster.py:922, cleanup) 2025-04-01 23:31:38 [ 665 ] DEBUG : Trying to prune unused networks... (cluster.py:928, cleanup) 2025-04-01 23:31:39 [ 665 ] DEBUG : Trying to prune unused images... (cluster.py:944, cleanup) 2025-04-01 23:31:39 [ 665 ] DEBUG : Command:[docker image prune -f] (cluster.py:122, run_and_check) 2025-04-01 23:31:39 [ 665 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:146, run_and_check) 2025-04-01 23:31:39 [ 665 ] DEBUG : Images pruned (cluster.py:947, cleanup) 2025-04-01 23:31:39 [ 665 ] DEBUG : Trying to prune unused volumes... 
(cluster.py:953, cleanup) 2025-04-01 23:31:39 [ 665 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:122, run_and_check) 2025-04-01 23:31:39 [ 665 ] DEBUG : Stdout:1 (cluster.py:146, run_and_check) 2025-04-01 23:31:39 [ 665 ] DEBUG : Volumes pruned: 1 (cluster.py:958, cleanup) ============================== slowest durations =============================== 900.00s call test_attach_partition_using_copy/test.py::test_not_work_on_different_disk 900.00s call test_attach_partition_using_copy/test.py::test_only_destination_replicated 900.00s call test_attach_partition_using_copy/test.py::test_both_mergetree 881.88s call test_attach_partition_using_copy/test.py::test_all_replicated 188.02s setup test_backup_restore_azure_blob_storage/test.py::test_backup_restore 162.73s setup test_azure_blob_storage_plain_rewritable/test.py::test_drop_table 131.26s call test_backward_compatibility/test_convert_ordinary.py::test_convert_ordinary_to_atomic 122.29s call test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Lazy-Log] 120.49s call test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Replicated-ReplicatedMergeTree] 113.81s call test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Atomic-MergeTree] 104.10s call test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Memory-MergeTree] 103.80s call test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Ordinary-MergeTree] 99.68s call test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_two_replicas 95.80s call test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool_replicated 74.45s setup test_allowed_url_from_config/test.py::test_HDFS 71.85s call test_cluster_discovery/test_dynamic_clusters.py::test_cluster_discovery_startup_and_stop 64.93s call test_cluster_discovery/test.py::test_cluster_discovery_startup_and_stop 61.31s call test_cluster_discovery/test_auxiliary_keeper.py::test_cluster_discovery_with_auxiliary_keeper_startup_and_stop 56.92s call test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool 53.32s call test_allowed_url_from_config/test.py::test_table_function_remote 41.78s call test_backward_compatibility/test_functions.py::test_aggregate_states 41.02s call test_MemoryTracking/test.py::test_tcp_multiple_sessions 40.23s call test_backward_compatibility/test_vertical_merges_from_compact_parts.py::test_vertical_merges_from_compact_parts 29.00s call test_MemoryTracking/test.py::test_tcp_single_session 28.71s call test_asynchronous_metrics_pk_bytes_fields/test.py::test_total_pk_bytes_in_memory_fields 28.28s setup test_cluster_discovery/test.py::test_cluster_discovery_startup_and_stop 27.33s call test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_different_nodes 25.97s setup test_broken_part_during_merge/test.py::test_merge_and_part_corruption 25.87s call test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_same_node 25.58s teardown test_backup_s3_storage_class/test.py::test_backup_s3_storage_class 24.78s call test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[500000] 23.92s teardown test_concurrent_ttl_merges/test.py::test_no_ttl_merges_in_busy_pool 22.12s teardown test_allowed_url_from_config/test.py::test_table_function_remote 21.88s teardown 
test_aliases_in_default_expr_not_break_table_structure/test.py::test_aliases_in_default_expr_not_break_table_structure[ReplicatedMergeTree] 21.58s call test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[500000] 21.03s setup test_backward_compatibility/test_vertical_merges_from_compact_parts.py::test_vertical_merges_from_compact_parts 21.02s setup test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool 20.84s call test_config_yaml_main/test.py::test_yaml_main_conf 20.66s setup test_asynchronous_metric_log_table/test.py::test_event_time_microseconds_field 20.03s setup test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_different_nodes 19.99s setup test_cluster_discovery/test_auxiliary_keeper.py::test_cluster_discovery_with_auxiliary_keeper_startup_and_stop 19.96s setup test_catboost_evaluate/test.py::testAmazonModelManyRows 19.86s call test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_avg 19.82s setup test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_replicated_table 19.56s setup test_backup_restore_on_cluster/test_slow_rmt.py::test_replicated_database_async 19.30s setup test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_avg 19.19s call test_MemoryTracking/test.py::test_http 19.13s setup test_aliases_in_default_expr_not_break_table_structure/test.py::test_aliases_in_default_expr_not_break_table_structure[ReplicatedMergeTree] 18.71s teardown test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf2 18.68s setup test_backward_compatibility/test_aggregate_fixed_key.py::test_two_level_merge 18.14s setup test_attach_partition_using_copy/test.py::test_all_replicated 17.41s call test_config_xml_yaml_mix/test.py::test_extra_yaml_mix 17.30s call test_catboost_evaluate/test.py::testAmazonModelManyRows 16.66s setup test_backward_compatibility/test.py::test_backward_compatability1 16.44s setup test_backup_s3_storage_class/test.py::test_backup_s3_storage_class 15.35s call test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[1000] 14.89s call test_config_decryption/test_wrong_settings.py::test_no_encryption_key 14.66s teardown test_backup_restore_on_cluster/test_concurrency.py::test_replicated_table 14.53s setup test_backward_compatibility/test_convert_ordinary.py::test_convert_ordinary_to_atomic 14.53s teardown test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_two_tables_with_uuid_in_zk_path 14.26s setup test_compressed_marks_restart/test.py::test_compressed_marks_restart_compact 14.04s call test_config_decryption/test_wrong_settings.py::test_invalid_chars 13.96s call test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[1000] 13.94s setup test_build_sets_from_multiple_threads/test.py::test_set 13.76s setup test_cluster_discovery/test_dynamic_clusters.py::test_cluster_discovery_startup_and_stop 13.60s setup test_MemoryTracking/test.py::test_http 13.58s call test_config_decryption/test_wrong_settings.py::test_wrong_method 13.55s setup test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active 13.25s teardown test_azure_blob_storage_plain_rewritable/test.py::test_restart_server 13.20s call test_config_decryption/test_wrong_settings.py::test_subnodes 13.04s setup 
test_backward_compatibility/test_select_aggregate_alias_column.py::test_select_aggregate_alias_column 12.68s setup test_backup_restore_storage_policy/test.py::test_storage_policies[None--default] 12.64s setup test_codec_encrypted/test.py::test_different_keys 12.58s teardown test_attach_partition_using_copy/test.py::test_only_destination_replicated 12.30s call test_concurrent_ttl_merges/test.py::test_no_ttl_merges_in_busy_pool 11.60s call test_catboost_evaluate/test.py::testRecoveryAfterCrash 11.56s teardown test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[500000] 11.49s setup test_composable_protocols/test.py::test_connections 11.35s call test_compressed_marks_restart/test.py::test_compressed_marks_restart_compact 10.97s setup test_backward_compatibility/test_functions.py::test_aggregate_states 10.93s setup test_backward_compatibility/test_ip_types_binary_compatibility.py::test_ip_types_binary_compatibility 10.67s setup test_concurrent_queries_for_user_restriction/test.py::test_exception_message 9.71s setup test_buffer_profile/test.py::test_buffer_profile 9.68s call test_aliases_in_default_expr_not_break_table_structure/test.py::test_aliases_in_default_expr_not_break_table_structure[ReplicatedMergeTree] 9.67s teardown test_backward_compatibility/test.py::test_backward_compatability1 9.28s call test_compressed_marks_restart/test.py::test_compressed_marks_restart_wide 9.10s call test_azure_blob_storage_plain_rewritable/test.py::test_restart_server 8.55s call test_backward_compatibility/test_ip_types_binary_compatibility.py::test_ip_types_binary_compatibility 8.20s call test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_two_tables_with_uuid_in_zk_path 8.06s call test_backup_restore_on_cluster/test_slow_rmt.py::test_replicated_database_async 7.94s call test_broken_part_during_merge/test.py::test_merge_and_part_corruption 7.74s teardown test_backup_restore_on_cluster/test_slow_rmt.py::test_replicated_database_async 7.60s setup test_asynchronous_metrics_pk_bytes_fields/test.py::test_total_pk_bytes_in_memory_fields 7.49s call test_backup_restore_on_cluster/test_concurrency.py::test_replicated_table 7.44s call test_backup_restore_azure_blob_storage/test.py::test_backup_restore_correct_block_ids 7.39s call test_backup_restore_on_cluster/test_concurrency.py::test_kill_mutation_during_backup 7.01s call test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_replicated_table 6.53s teardown test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[500000] 6.34s teardown test_build_sets_from_multiple_threads/test.py::test_set 6.15s teardown test_backward_compatibility/test_functions.py::test_string_functions 6.09s teardown test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[1000] 6.00s teardown test_broken_part_during_merge/test.py::test_merge_and_part_corruption 5.93s teardown test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy2-policy2] 5.91s call test_asynchronous_metric_log_table/test.py::test_event_time_microseconds_field 5.79s teardown test_asynchronous_metric_log_table/test.py::test_event_time_microseconds_field 5.68s teardown test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_avg 5.45s teardown test_backward_compatibility/test_aggregate_fixed_key.py::test_two_level_merge 5.26s teardown 
test_cluster_discovery/test_auxiliary_keeper.py::test_cluster_discovery_with_auxiliary_keeper_startup_and_stop 5.24s call test_build_sets_from_multiple_threads/test.py::test_set 5.12s teardown test_buffer_profile/test.py::test_default_profile 4.62s teardown test_cluster_discovery/test.py::test_cluster_discovery_startup_and_stop 4.56s teardown test_concurrent_queries_for_user_restriction/test.py::test_exception_message 4.47s teardown test_MemoryTracking/test.py::test_tcp_single_session 4.38s teardown test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[1000] 4.31s setup test_backward_compatibility/test_normalized_count_comparison.py::test_select_aggregate_alias_column 4.25s teardown test_codec_encrypted/test.py::test_different_keys 3.92s teardown test_backward_compatibility/test_vertical_merges_from_compact_parts.py::test_vertical_merges_from_compact_parts 3.70s teardown test_backward_compatibility/test_select_aggregate_alias_column.py::test_select_aggregate_alias_column 3.56s teardown test_compressed_marks_restart/test.py::test_compressed_marks_restart_wide 3.50s teardown test_backward_compatibility/test_normalized_count_comparison.py::test_select_aggregate_alias_column 3.32s teardown test_asynchronous_metrics_pk_bytes_fields/test.py::test_total_pk_bytes_in_memory_fields 3.27s teardown test_backward_compatibility/test_convert_ordinary.py::test_convert_ordinary_to_atomic 3.11s call test_codec_encrypted/test.py::test_different_keys 3.10s teardown test_cluster_discovery/test_dynamic_clusters.py::test_cluster_discovery_startup_and_stop 3.00s teardown test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active 2.96s call test_allowed_url_from_config/test.py::test_config_with_only_primary_hosts 2.94s call test_catboost_evaluate/test.py::testSystemModelsAndModelRefresh 2.73s call test_allowed_url_from_config/test.py::test_config_without_allowed_hosts_section 2.53s call test_azure_blob_storage_plain_rewritable/test.py::test_insert_select 2.51s teardown test_backward_compatibility/test_ip_types_binary_compatibility.py::test_ip_types_binary_compatibility 2.50s teardown test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_replicated_table 2.47s teardown test_composable_protocols/test.py::test_connections 2.33s call test_allowed_url_from_config/test.py::test_config_without_allowed_hosts 2.09s call test_backward_compatibility/test_normalized_count_comparison.py::test_select_aggregate_alias_column 1.88s call test_allowed_url_from_config/test.py::test_config_with_hosts 1.85s call test_backward_compatibility/test_aggregate_fixed_key.py::test_two_level_merge 1.84s call test_backup_restore_storage_policy/test.py::test_storage_policies[policy1--default] 1.77s call test_backup_restore_azure_blob_storage/test.py::test_backup_restore_on_merge_tree 1.71s call test_catboost_evaluate/test.py::testConstantFeatures 1.68s call test_backward_compatibility/test_select_aggregate_alias_column.py::test_select_aggregate_alias_column 1.66s teardown test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Atomic-MergeTree] 1.61s teardown test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_same_node 1.59s call test_catboost_evaluate/test.py::testModelPathIsNotAConstString 1.57s call test_backup_restore_storage_policy/test.py::test_storage_policies[None-policy1-policy1] 1.57s call 
test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy2-policy2] 1.51s call test_allowed_url_from_config/test.py::test_config_with_only_regexp_hosts 1.46s call test_concurrent_queries_for_user_restriction/test.py::test_exception_message 1.46s call test_backup_restore_storage_policy/test.py::test_storage_policies[None--default] 1.43s call test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-None-policy1] 1.40s call test_catboost_evaluate/test.py::testModelUpdate 1.37s call test_backup_restore_azure_blob_storage/test.py::test_backup_restore 1.36s call test_catboost_evaluate/test.py::testNonConstantFeatures 1.36s call test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy1-policy1] 1.35s call test_backup_restore_azure_blob_storage/test.py::test_backup_restore_diff_container 1.34s teardown test_catboost_evaluate/test.py::testWrongNumberOfFeatureArguments 1.33s call test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf2 1.31s call test_backup_restore_storage_policy/test.py::test_storage_policies[None-None-default] 1.30s call test_catboost_evaluate/test.py::testAmazonModelSingleRow 1.26s call test_backup_s3_storage_class/test.py::test_backup_s3_storage_class 1.21s call test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf1 1.07s call test_buffer_profile/test.py::test_default_profile 0.98s call test_allowed_url_from_config/test.py::test_redirect 0.91s call test_composable_protocols/test.py::test_connections 0.89s call test_allowed_url_from_config/test.py::test_HDFS 0.86s call test_catboost_evaluate/test.py::testWrongNumberOfFeatureArguments 0.85s call test_catboost_evaluate/test.py::testFloatFeatureMustBeNumeric 0.81s call test_catboost_evaluate/test.py::testOnNullableFeatures 0.77s call test_catboost_evaluate/test.py::testInvalidLibraryPath 0.76s call test_azure_blob_storage_plain_rewritable/test.py::test_drop_table 0.76s call test_catboost_evaluate/test.py::testInvalidModelPath 0.75s call test_backward_compatibility/test.py::test_backward_compatability1 0.74s teardown test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Replicated-ReplicatedMergeTree] 0.74s call test_buffer_profile/test.py::test_buffer_profile 0.73s teardown test_backup_restore_on_cluster/test_concurrency.py::test_kill_mutation_during_backup 0.65s call test_backward_compatibility/test_functions.py::test_string_functions 0.64s teardown test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_different_nodes 0.61s call test_catboost_evaluate/test.py::testCategoricalFeatureMustBeNumericOrString 0.60s call test_catboost_evaluate/test.py::testOnLowCardinalityFeatures 0.59s teardown test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Lazy-Log] 0.58s teardown test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Memory-MergeTree] 0.58s teardown test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Ordinary-MergeTree] 0.55s call test_allowed_url_from_config/test.py::test_schema_inference 0.28s call test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active 0.27s teardown test_backup_restore_storage_policy/test.py::test_storage_policies[policy1--default] 0.21s teardown 
test_backup_restore_storage_policy/test.py::test_storage_policies[None-policy1-policy1] 0.17s teardown test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy1-policy1] 0.17s teardown test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-None-policy1] 0.17s teardown test_backup_restore_storage_policy/test.py::test_storage_policies[None-None-default] 0.17s teardown test_backup_restore_storage_policy/test.py::test_storage_policies[None--default] 0.05s setup test_config_decryption/test_wrong_settings.py::test_invalid_chars 0.03s teardown test_allowed_url_from_config/test.py::test_HDFS 0.02s setup test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Atomic-MergeTree] 0.01s teardown test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool 0.00s teardown test_config_yaml_main/test.py::test_yaml_main_conf 0.00s teardown test_catboost_evaluate/test.py::testAmazonModelManyRows 0.00s setup test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[1000] 0.00s setup test_backup_restore_on_cluster/test_concurrency.py::test_kill_mutation_during_backup 0.00s setup test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool_replicated 0.00s setup test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Memory-MergeTree] 0.00s teardown test_compressed_marks_restart/test.py::test_compressed_marks_restart_compact 0.00s setup test_config_xml_yaml_mix/test.py::test_extra_yaml_mix 0.00s teardown test_backward_compatibility/test_functions.py::test_aggregate_states 0.00s setup test_config_yaml_main/test.py::test_yaml_main_conf 0.00s setup test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[500000] 0.00s teardown test_MemoryTracking/test.py::test_tcp_multiple_sessions 0.00s teardown test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool_replicated 0.00s setup test_config_decryption/test_wrong_settings.py::test_subnodes 0.00s setup test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Lazy-Log] 0.00s teardown test_backup_restore_azure_blob_storage/test.py::test_backup_restore 0.00s setup test_concurrent_ttl_merges/test.py::test_no_ttl_merges_in_busy_pool 0.00s setup test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_two_tables_with_uuid_in_zk_path 0.00s setup test_backup_restore_azure_blob_storage/test.py::test_backup_restore_diff_container 0.00s setup test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf2 0.00s setup test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[1000] 0.00s setup test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_two_replicas 0.00s teardown test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_two_replicas 0.00s teardown test_MemoryTracking/test.py::test_http 0.00s setup test_MemoryTracking/test.py::test_tcp_single_session 0.00s teardown test_config_xml_yaml_mix/test.py::test_extra_yaml_mix 0.00s setup test_MemoryTracking/test.py::test_tcp_multiple_sessions 0.00s setup test_compressed_marks_restart/test.py::test_compressed_marks_restart_wide 0.00s setup test_backward_compatibility/test_functions.py::test_string_functions 0.00s setup 
test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Ordinary-MergeTree] 0.00s teardown test_backup_restore_azure_blob_storage/test.py::test_backup_restore_correct_block_ids 0.00s teardown test_azure_blob_storage_plain_rewritable/test.py::test_drop_table 0.00s teardown test_config_decryption/test_wrong_settings.py::test_no_encryption_key 0.00s setup test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy1-policy1] 0.00s setup test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_same_node 0.00s setup test_allowed_url_from_config/test.py::test_config_without_allowed_hosts 0.00s setup test_backup_restore_azure_blob_storage/test.py::test_backup_restore_correct_block_ids 0.00s setup test_allowed_url_from_config/test.py::test_config_with_hosts 0.00s setup test_backup_restore_storage_policy/test.py::test_storage_policies[policy1--default] 0.00s setup test_catboost_evaluate/test.py::testAmazonModelSingleRow 0.00s teardown test_allowed_url_from_config/test.py::test_config_without_allowed_hosts_section 0.00s teardown test_attach_partition_using_copy/test.py::test_all_replicated 0.00s teardown test_catboost_evaluate/test.py::testConstantFeatures 0.00s setup test_backup_restore_azure_blob_storage/test.py::test_backup_restore_on_merge_tree 0.00s teardown test_buffer_profile/test.py::test_buffer_profile 0.00s setup test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-None-policy1] 0.00s setup test_azure_blob_storage_plain_rewritable/test.py::test_insert_select 0.00s setup test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Replicated-ReplicatedMergeTree] 0.00s setup test_catboost_evaluate/test.py::testRecoveryAfterCrash 0.00s setup test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[500000] 0.00s setup test_config_decryption/test_wrong_settings.py::test_wrong_method 0.00s setup test_backup_restore_storage_policy/test.py::test_storage_policies[None-policy1-policy1] 0.00s setup test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy2-policy2] 0.00s setup test_azure_blob_storage_plain_rewritable/test.py::test_restart_server 0.00s teardown test_catboost_evaluate/test.py::testOnNullableFeatures 0.00s setup test_allowed_url_from_config/test.py::test_redirect 0.00s setup test_backup_restore_storage_policy/test.py::test_storage_policies[None-None-default] 0.00s setup test_catboost_evaluate/test.py::testFloatFeatureMustBeNumeric 0.00s setup test_catboost_evaluate/test.py::testWrongNumberOfFeatureArguments 0.00s setup test_catboost_evaluate/test.py::testInvalidModelPath 0.00s teardown test_allowed_url_from_config/test.py::test_config_with_hosts 0.00s setup test_catboost_evaluate/test.py::testInvalidLibraryPath 0.00s teardown test_backup_restore_azure_blob_storage/test.py::test_backup_restore_on_merge_tree 0.00s setup test_catboost_evaluate/test.py::testOnLowCardinalityFeatures 0.00s teardown test_config_decryption/test_wrong_settings.py::test_invalid_chars 0.00s setup test_backup_restore_on_cluster/test_concurrency.py::test_replicated_table 0.00s setup test_buffer_profile/test.py::test_default_profile 0.00s setup test_allowed_url_from_config/test.py::test_schema_inference 0.00s teardown test_catboost_evaluate/test.py::testInvalidLibraryPath 0.00s setup test_catboost_evaluate/test.py::testSystemModelsAndModelRefresh 0.00s setup test_catboost_evaluate/test.py::testModelUpdate 0.00s setup 
test_catboost_evaluate/test.py::testOnNullableFeatures 0.00s teardown test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf1 0.00s setup test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf1 0.00s teardown test_config_decryption/test_wrong_settings.py::test_wrong_method 0.00s setup test_catboost_evaluate/test.py::testNonConstantFeatures 0.00s teardown test_azure_blob_storage_plain_rewritable/test.py::test_insert_select 0.00s teardown test_allowed_url_from_config/test.py::test_config_with_only_regexp_hosts 0.00s setup test_config_decryption/test_wrong_settings.py::test_no_encryption_key 0.00s setup test_catboost_evaluate/test.py::testCategoricalFeatureMustBeNumericOrString 0.00s teardown test_allowed_url_from_config/test.py::test_config_without_allowed_hosts 0.00s setup test_allowed_url_from_config/test.py::test_table_function_remote 0.00s teardown test_attach_partition_using_copy/test.py::test_both_mergetree 0.00s teardown test_config_decryption/test_wrong_settings.py::test_subnodes 0.00s setup test_allowed_url_from_config/test.py::test_config_with_only_primary_hosts 0.00s setup test_allowed_url_from_config/test.py::test_config_with_only_regexp_hosts 0.00s setup test_allowed_url_from_config/test.py::test_config_without_allowed_hosts_section 0.00s teardown test_attach_partition_using_copy/test.py::test_not_work_on_different_disk 0.00s teardown test_catboost_evaluate/test.py::testNonConstantFeatures 0.00s teardown test_catboost_evaluate/test.py::testSystemModelsAndModelRefresh 0.00s teardown test_catboost_evaluate/test.py::testFloatFeatureMustBeNumeric 0.00s setup test_attach_partition_using_copy/test.py::test_both_mergetree 0.00s teardown test_catboost_evaluate/test.py::testRecoveryAfterCrash 0.00s teardown test_allowed_url_from_config/test.py::test_redirect 0.00s setup test_attach_partition_using_copy/test.py::test_not_work_on_different_disk 0.00s setup test_catboost_evaluate/test.py::testModelPathIsNotAConstString 0.00s setup test_catboost_evaluate/test.py::testConstantFeatures 0.00s teardown test_allowed_url_from_config/test.py::test_config_with_only_primary_hosts 0.00s teardown test_catboost_evaluate/test.py::testAmazonModelSingleRow 0.00s teardown test_catboost_evaluate/test.py::testModelPathIsNotAConstString 0.00s teardown test_backup_restore_azure_blob_storage/test.py::test_backup_restore_diff_container 0.00s setup test_attach_partition_using_copy/test.py::test_only_destination_replicated 0.00s teardown test_allowed_url_from_config/test.py::test_schema_inference 0.00s teardown test_catboost_evaluate/test.py::testCategoricalFeatureMustBeNumericOrString 0.00s teardown test_catboost_evaluate/test.py::testOnLowCardinalityFeatures 0.00s teardown test_catboost_evaluate/test.py::testModelUpdate 0.00s teardown test_catboost_evaluate/test.py::testInvalidModelPath =========================== short test summary info ============================ FAILED test_attach_partition_using_copy/test.py::test_all_replicated - Failed... FAILED test_attach_partition_using_copy/test.py::test_both_mergetree - Failed... 
FAILED test_attach_partition_using_copy/test.py::test_not_work_on_different_disk FAILED test_attach_partition_using_copy/test.py::test_only_destination_replicated PASSED test_config_decryption/test_wrong_settings.py::test_invalid_chars PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[None--default] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[None-None-default] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[None-policy1-policy1] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[policy1--default] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-None-policy1] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy1-policy1] PASSED test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy2-policy2] PASSED test_config_decryption/test_wrong_settings.py::test_no_encryption_key PASSED test_MemoryTracking/test.py::test_http PASSED test_catboost_evaluate/test.py::testAmazonModelManyRows PASSED test_catboost_evaluate/test.py::testAmazonModelSingleRow PASSED test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_avg PASSED test_catboost_evaluate/test.py::testCategoricalFeatureMustBeNumericOrString PASSED test_catboost_evaluate/test.py::testConstantFeatures PASSED test_catboost_evaluate/test.py::testFloatFeatureMustBeNumeric PASSED test_config_decryption/test_wrong_settings.py::test_subnodes PASSED test_catboost_evaluate/test.py::testInvalidLibraryPath PASSED test_catboost_evaluate/test.py::testInvalidModelPath PASSED test_catboost_evaluate/test.py::testModelPathIsNotAConstString PASSED test_catboost_evaluate/test.py::testModelUpdate PASSED test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_different_nodes PASSED test_catboost_evaluate/test.py::testNonConstantFeatures PASSED test_catboost_evaluate/test.py::testOnLowCardinalityFeatures PASSED test_catboost_evaluate/test.py::testOnNullableFeatures PASSED test_config_decryption/test_wrong_settings.py::test_wrong_method PASSED test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[1000] PASSED test_catboost_evaluate/test.py::testRecoveryAfterCrash PASSED test_catboost_evaluate/test.py::testSystemModelsAndModelRefresh PASSED test_catboost_evaluate/test.py::testWrongNumberOfFeatureArguments PASSED test_MemoryTracking/test.py::test_tcp_multiple_sessions PASSED test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_same_node PASSED test_allowed_url_from_config/test.py::test_HDFS PASSED test_allowed_url_from_config/test.py::test_config_with_hosts PASSED test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool PASSED test_allowed_url_from_config/test.py::test_config_with_only_primary_hosts PASSED test_allowed_url_from_config/test.py::test_config_with_only_regexp_hosts PASSED test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_replicated_table PASSED test_allowed_url_from_config/test.py::test_config_without_allowed_hosts PASSED test_allowed_url_from_config/test.py::test_config_without_allowed_hosts_section PASSED test_allowed_url_from_config/test.py::test_redirect PASSED test_allowed_url_from_config/test.py::test_schema_inference PASSED test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[500000] PASSED 
test_compressed_marks_restart/test.py::test_compressed_marks_restart_compact PASSED test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_two_tables_with_uuid_in_zk_path PASSED test_compressed_marks_restart/test.py::test_compressed_marks_restart_wide PASSED test_MemoryTracking/test.py::test_tcp_single_session PASSED test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[1000] PASSED test_buffer_profile/test.py::test_buffer_profile PASSED test_buffer_profile/test.py::test_default_profile PASSED test_aliases_in_default_expr_not_break_table_structure/test.py::test_aliases_in_default_expr_not_break_table_structure[ReplicatedMergeTree] PASSED test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[500000] PASSED test_allowed_url_from_config/test.py::test_table_function_remote PASSED test_backward_compatibility/test_aggregate_fixed_key.py::test_two_level_merge PASSED test_backward_compatibility/test_functions.py::test_aggregate_states PASSED test_backward_compatibility/test_ip_types_binary_compatibility.py::test_ip_types_binary_compatibility PASSED test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool_replicated PASSED test_backup_restore_on_cluster/test_slow_rmt.py::test_replicated_database_async PASSED test_backward_compatibility/test_normalized_count_comparison.py::test_select_aggregate_alias_column PASSED test_backward_compatibility/test.py::test_backward_compatability1 PASSED test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Atomic-MergeTree] PASSED test_backup_restore_azure_blob_storage/test.py::test_backup_restore PASSED test_asynchronous_metrics_pk_bytes_fields/test.py::test_total_pk_bytes_in_memory_fields PASSED test_azure_blob_storage_plain_rewritable/test.py::test_drop_table PASSED test_azure_blob_storage_plain_rewritable/test.py::test_insert_select PASSED test_backup_restore_azure_blob_storage/test.py::test_backup_restore_correct_block_ids PASSED test_backup_restore_azure_blob_storage/test.py::test_backup_restore_diff_container PASSED test_backup_restore_azure_blob_storage/test.py::test_backup_restore_on_merge_tree PASSED test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf1 PASSED test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf2 PASSED test_azure_blob_storage_plain_rewritable/test.py::test_restart_server PASSED test_asynchronous_metric_log_table/test.py::test_event_time_microseconds_field PASSED test_backup_s3_storage_class/test.py::test_backup_s3_storage_class PASSED test_build_sets_from_multiple_threads/test.py::test_set PASSED test_config_yaml_main/test.py::test_yaml_main_conf PASSED test_backward_compatibility/test_vertical_merges_from_compact_parts.py::test_vertical_merges_from_compact_parts PASSED test_broken_part_during_merge/test.py::test_merge_and_part_corruption PASSED test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_two_replicas PASSED test_concurrent_ttl_merges/test.py::test_no_ttl_merges_in_busy_pool PASSED test_cluster_discovery/test_auxiliary_keeper.py::test_cluster_discovery_with_auxiliary_keeper_startup_and_stop PASSED test_cluster_discovery/test.py::test_cluster_discovery_startup_and_stop PASSED test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Lazy-Log] PASSED test_composable_protocols/test.py::test_connections PASSED 
test_codec_encrypted/test.py::test_different_keys PASSED test_backward_compatibility/test_convert_ordinary.py::test_convert_ordinary_to_atomic PASSED test_cluster_discovery/test_dynamic_clusters.py::test_cluster_discovery_startup_and_stop PASSED test_backward_compatibility/test_select_aggregate_alias_column.py::test_select_aggregate_alias_column PASSED test_concurrent_queries_for_user_restriction/test.py::test_exception_message PASSED test_config_xml_yaml_mix/test.py::test_extra_yaml_mix PASSED test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Memory-MergeTree] PASSED test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Ordinary-MergeTree] PASSED test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Replicated-ReplicatedMergeTree] PASSED test_backup_restore_on_cluster/test_concurrency.py::test_kill_mutation_during_backup PASSED test_backup_restore_on_cluster/test_concurrency.py::test_replicated_table SKIPPED [1] test_backward_compatibility/test_functions.py:164: The test is slow in builds with sanitizer SKIPPED [1] test_asynchronous_metric_jemalloc_profile_active/test.py:30: Disabled for sanitizers
============= 4 failed, 94 passed, 2 skipped in 3616.24s (1:00:16) =============
Traceback (most recent call last):
  File "/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration/./runner", line 528, in <module>
    subprocess.check_call(cmd, shell=True)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'docker run --rm --name clickhouse_integration_tests_pcfdyn --privileged --dns-search='.' --memory=30709030912 --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-odbc-bridge:/clickhouse-odbc-bridge --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-library-bridge:/clickhouse-library-bridge --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/backupview:/ClickHouse/utils/backupview --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/grpc-client/pb2:/ClickHouse/utils/grpc-client/pb2 --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_DOTNET_CLIENT_TAG=11de0b29a15d -e DOCKER_HELPER_TAG=5dc43a6382f0 -e DOCKER_BASE_TAG=6712d5cc610d -e DOCKER_KERBEROS_KDC_TAG=9391ecdee8d7 -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=9bec2a638e6e -e DOCKER_MYSQL_JAVA_CLIENT_TAG=766bff31cfe4 -e DOCKER_MYSQL_JS_CLIENT_TAG=41ba7c2ec2a1 -e DOCKER_MYSQL_PHP_CLIENT_TAG=88be89c1e3b6 -e DOCKER_NGINX_DAV_TAG=b55ac9cd7519 -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=a4eff5c7f4d6 -e DOCKER_PYTHON_BOTTLE_TAG=caad4729259e -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e PYTHONUNBUFFERED=1 -e PYTEST_ADDOPTS="--dist=loadfile -n 10 -rfEps --run-id=0 --color=no --durations=0 test_MemoryTracking/test.py::test_http test_MemoryTracking/test.py::test_tcp_multiple_sessions test_MemoryTracking/test.py::test_tcp_single_session 'test_aliases_in_default_expr_not_break_table_structure/test.py::test_aliases_in_default_expr_not_break_table_structure[ReplicatedMergeTree]' test_allowed_url_from_config/test.py::test_HDFS test_allowed_url_from_config/test.py::test_config_with_hosts
test_allowed_url_from_config/test.py::test_config_with_only_primary_hosts test_allowed_url_from_config/test.py::test_config_with_only_regexp_hosts test_allowed_url_from_config/test.py::test_config_without_allowed_hosts test_allowed_url_from_config/test.py::test_config_without_allowed_hosts_section test_allowed_url_from_config/test.py::test_redirect test_allowed_url_from_config/test.py::test_schema_inference test_allowed_url_from_config/test.py::test_table_function_remote test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active test_asynchronous_metric_log_table/test.py::test_event_time_microseconds_field test_asynchronous_metrics_pk_bytes_fields/test.py::test_total_pk_bytes_in_memory_fields test_attach_partition_using_copy/test.py::test_all_replicated test_attach_partition_using_copy/test.py::test_both_mergetree test_attach_partition_using_copy/test.py::test_not_work_on_different_disk test_attach_partition_using_copy/test.py::test_only_destination_replicated test_azure_blob_storage_plain_rewritable/test.py::test_drop_table test_azure_blob_storage_plain_rewritable/test.py::test_insert_select test_azure_blob_storage_plain_rewritable/test.py::test_restart_server test_backup_restore_azure_blob_storage/test.py::test_backup_restore test_backup_restore_azure_blob_storage/test.py::test_backup_restore_correct_block_ids test_backup_restore_azure_blob_storage/test.py::test_backup_restore_diff_container test_backup_restore_azure_blob_storage/test.py::test_backup_restore_on_merge_tree test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf1 test_backup_restore_azure_blob_storage/test.py::test_backup_restore_with_named_collection_azure_conf2 test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_different_nodes test_backup_restore_on_cluster/test_concurrency.py::test_concurrent_backups_on_same_node 'test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Atomic-MergeTree]' 'test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Lazy-Log]' 'test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Memory-MergeTree]' 'test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Ordinary-MergeTree]' 'test_backup_restore_on_cluster/test_concurrency.py::test_create_or_drop_tables_during_backup[Replicated-ReplicatedMergeTree]' test_backup_restore_on_cluster/test_concurrency.py::test_kill_mutation_during_backup test_backup_restore_on_cluster/test_concurrency.py::test_replicated_table test_backup_restore_on_cluster/test_slow_rmt.py::test_replicated_database_async test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_replicated_table test_backup_restore_on_cluster/test_two_shards_two_replicas.py::test_two_tables_with_uuid_in_zk_path 'test_backup_restore_storage_policy/test.py::test_storage_policies[None--default]' 'test_backup_restore_storage_policy/test.py::test_storage_policies[None-None-default]' 'test_backup_restore_storage_policy/test.py::test_storage_policies[None-policy1-policy1]' 'test_backup_restore_storage_policy/test.py::test_storage_policies[policy1--default]' 'test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-None-policy1]' 'test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy1-policy1]' 
'test_backup_restore_storage_policy/test.py::test_storage_policies[policy1-policy2-policy2]' test_backup_s3_storage_class/test.py::test_backup_s3_storage_class test_backward_compatibility/test.py::test_backward_compatability1 test_backward_compatibility/test_aggregate_fixed_key.py::test_two_level_merge test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_avg 'test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[1000]' 'test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact[500000]' 'test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[1000]' 'test_backward_compatibility/test_aggregate_function_state.py::test_backward_compatability_for_uniq_exact_variadic[500000]' test_backward_compatibility/test_convert_ordinary.py::test_convert_ordinary_to_atomic test_backward_compatibility/test_functions.py::test_aggregate_states test_backward_compatibility/test_functions.py::test_string_functions test_backward_compatibility/test_ip_types_binary_compatibility.py::test_ip_types_binary_compatibility test_backward_compatibility/test_normalized_count_comparison.py::test_select_aggregate_alias_column test_backward_compatibility/test_select_aggregate_alias_column.py::test_select_aggregate_alias_column test_backward_compatibility/test_vertical_merges_from_compact_parts.py::test_vertical_merges_from_compact_parts test_broken_part_during_merge/test.py::test_merge_and_part_corruption test_buffer_profile/test.py::test_buffer_profile test_buffer_profile/test.py::test_default_profile test_build_sets_from_multiple_threads/test.py::test_set test_catboost_evaluate/test.py::testAmazonModelManyRows test_catboost_evaluate/test.py::testAmazonModelSingleRow test_catboost_evaluate/test.py::testCategoricalFeatureMustBeNumericOrString test_catboost_evaluate/test.py::testConstantFeatures test_catboost_evaluate/test.py::testFloatFeatureMustBeNumeric test_catboost_evaluate/test.py::testInvalidLibraryPath test_catboost_evaluate/test.py::testInvalidModelPath test_catboost_evaluate/test.py::testModelPathIsNotAConstString test_catboost_evaluate/test.py::testModelUpdate test_catboost_evaluate/test.py::testNonConstantFeatures test_catboost_evaluate/test.py::testOnLowCardinalityFeatures test_catboost_evaluate/test.py::testOnNullableFeatures test_catboost_evaluate/test.py::testRecoveryAfterCrash test_catboost_evaluate/test.py::testSystemModelsAndModelRefresh test_catboost_evaluate/test.py::testWrongNumberOfFeatureArguments test_cluster_discovery/test.py::test_cluster_discovery_startup_and_stop test_cluster_discovery/test_auxiliary_keeper.py::test_cluster_discovery_with_auxiliary_keeper_startup_and_stop test_cluster_discovery/test_dynamic_clusters.py::test_cluster_discovery_startup_and_stop test_codec_encrypted/test.py::test_different_keys test_composable_protocols/test.py::test_connections test_compressed_marks_restart/test.py::test_compressed_marks_restart_compact test_compressed_marks_restart/test.py::test_compressed_marks_restart_wide test_concurrent_queries_for_user_restriction/test.py::test_exception_message test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_in_empty_pool_replicated test_concurrent_ttl_merges/test.py::test_limited_ttl_merges_two_replicas test_concurrent_ttl_merges/test.py::test_no_ttl_merges_in_busy_pool 
test_config_decryption/test_wrong_settings.py::test_invalid_chars test_config_decryption/test_wrong_settings.py::test_no_encryption_key test_config_decryption/test_wrong_settings.py::test_subnodes test_config_decryption/test_wrong_settings.py::test_wrong_method test_config_xml_yaml_mix/test.py::test_extra_yaml_mix test_config_yaml_main/test.py::test_yaml_main_conf -vvv" altinityinfra/integration-tests-runner:cd6390247eca ' returned non-zero exit status 1.
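
The traceback above is the integration-test runner surfacing pytest's failure: the runner shells out to the 'docker run ... altinityinfra/integration-tests-runner:cd6390247eca' command shown, pytest inside that container exits with status 1 because 4 tests failed, and subprocess.check_call turns that non-zero exit into CalledProcessError. Below is a minimal sketch of that mechanism only; it is not the actual runner script, and the command string is a shortened placeholder.

import subprocess

# Shortened placeholder for the real "docker run ... integration-tests-runner" command
# assembled by tests/integration/runner; the real command also mounts the clickhouse
# binaries, server configs and test sources, and selects tests via PYTEST_ADDOPTS.
cmd = "docker run --rm example/integration-tests-runner:latest"

try:
    # check_call raises CalledProcessError whenever the command exits non-zero,
    # e.g. when pytest inside the container reports failed tests (exit status 1).
    subprocess.check_call(cmd, shell=True)
except subprocess.CalledProcessError as error:
    print(f"integration tests failed with exit status {error.returncode}")
    raise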
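
For readability, here is the statement the four failing test_attach_partition_using_copy tests kept re-issuing (the repeated "Executing query ATTACH TABLE source ..." entries earlier in the log, roughly one per minute, with the calls running for the 900 s shown in the slowest-durations list). The DDL is copied from the log; the surrounding Python is only a sketch: replica1 stands for the node object the test drives through the integration-test helpers, and the retry wrapper is hypothetical, not the test's actual code.

import time

# ATTACH of a MergeTree table backed by the read-only 'web' disk, as logged above.
ATTACH_SOURCE_DDL = """
ATTACH TABLE source UUID 'cf712b4f-2ca8-435c-ac23-c4393efe52f7'
(
    price UInt32,
    date Date,
    postcode1 LowCardinality(String),
    postcode2 LowCardinality(String),
    type Enum8('other' = 0, 'terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4),
    is_new UInt8,
    duration Enum8('unknown' = 0, 'freehold' = 1, 'leasehold' = 2),
    addr1 String,
    addr2 String,
    street LowCardinality(String),
    locality LowCardinality(String),
    town LowCardinality(String),
    district LowCardinality(String),
    county LowCardinality(String)
)
ENGINE = MergeTree()
ORDER BY (postcode1, postcode2, addr1, addr2)
SETTINGS disk = disk(type = web, endpoint = 'https://raw.githubusercontent.com/ClickHouse/web-tables-demo/main/web/')
"""

def attach_source(replica1, attempts=15, delay=55):
    # Hypothetical retry wrapper mirroring the roughly one-minute cadence of the
    # repeated ATTACH attempts in the captured log; replica1.query() is the helper
    # the integration tests use to run SQL on a node.
    for attempt in range(attempts):
        try:
            return replica1.query(ATTACH_SOURCE_DDL)
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(delay)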